Individual Submission Summary
Discretionary Decision-Making and Inequality in a Medicaid Eligibility Algorithm

Mon, August 11, 8:00 to 9:30am, East Tower, Hyatt Regency Chicago, Floor: Ballroom Level/Gold, Grand Hall G

Abstract

Despite their reputation as less biased than human beings, algorithms can reproduce social inequality in resource allocation and decision-making. Research on algorithmic inequality focuses on how algorithms themselves are biased or rely on data that reflect entrenched inequality. However, there is limited work examining how the humans who implement or otherwise interact with algorithms do so in ways that might exacerbate inequality. Using key informant interviews and documentary analysis of court appeals, this paper examines how government personnel implement and adjudicate an algorithm that determines functional eligibility for Medicaid long-term care in Wisconsin. I find that eligibility screeners and appeal judges use discretion in ways that disproportionately benefit Medicaid applicants with greater social and cultural capital. In contrast to most accounts of algorithmic decision-making, the presence of this human bias does not indicate a misuse of the algorithm. Rather, screeners and judges are encouraged to use their discretion, or "professional judgment," to compensate for the limitations of the algorithm. In focusing on the actors who implement algorithmic tools, this paper shows how the intentional but constrained discretionary power of key personnel can contribute to algorithmic inequality based on less visible dimensions of social and cultural capital.
