Disabled folks who rely on certain public-support programs often live in abject fear of nurses and social workers. I know I do.
We’re afraid that some nurse or social worker hired by the state to conduct eligibility determinations is going to suddenly determine that our support should be reduced or eliminated.
Now a report just put out by the Center for Democracy and Technology, a twenty-five-year-old nonprofit, non-partisan organization that describes itself as working to promote democratic values by shaping technology policy and architecture, says disabled folks like me don’t have to fear being cut off by nurses and social workers so much anymore.
Rather, it says, we now have to fear being cut off by algorithms.
The report is entitled “Challenging the Use of Algorithm-driven Decision-making in Benefits Determinations Affecting People with Disabilities.” It says, “An increasing number of states are turning to more automated algorithm-driven assessment and decision-making, relying on tools that quickly process multiple data inputs to evaluate whether a person needs assistance and how much they should receive. . . . Reports from people on the ground confirm that the tools are frequently reducing and denying benefits, often with unfair and inhumane results.”
So now, instead of a human doing the dirty work of deciding which disabled person deserves to lose benefits, a computer does it. One example cited in the report concerns Bradley Ledgerwood of rural Arkansas.
Ledgerwood has cerebral palsy and is active in his community, to the point where he served as alderman in the town of Cash. The reason Ledgerwood has been able to get around so well is that a state Medicaid program pays the wages of people who assist him in his home for up to fifty-six hours a week. An actual human nurse authorized that many hours of support for Ledgerwood.
But in 2016, after the state started using an algorithm-driven program to assess eligibility, Ledgerwood’s hours were cut to thirty-two a week, even though his needs hadn’t changed.
And Ledgerwood sure isn’t the only one. That same year, 47 percent of the people using that in-home assistance program in Arkansas had their hours reduced.
Fortunately, Ledgerwood took legal action and had his hours restored. The report discusses many cases where disabled folks in other states were cut off by algorithms and sued. But not everyone prevailed.
This sinister use of algorithms is still prominent—and it’s not just in the conservative, Republican South. At the end of last year, the report says, the Department of Health Care Finance in Washington, D.C., hired a private company to implement an algorithm-driven assessment tool to make eligibility determinations for an in-home assistance program for seniors and disabled folks.
“As soon as the tool went into effect, hundreds of disabled people and older people saw drastic cuts in their home care hours, creating gaps for people who otherwise depended on consistently available care. Others found their eligibility terminated after reassessment,” the report says.
I can see why some governors and legislators would be enamored with using algorithms to cut support, with no due process or anything, for people using programs like these. It’s a cold, gutless, heartless thing to do, and that’s the strength of algorithms. They just do their job, and they don’t care who they hurt.
And so the humans in high places who want to chop these programs to pieces can do so in a way that is ruthlessly efficient but draws little public attention to the cruelty of it all. That makes it a whole lot easier to accomplish that mission.