Individual Submission Summary

How The Criminal Justice System Should Think About Thinking Machines: Avoiding Algorithmic Bias in Criminal Justice

Wed, Nov 15, 3:30 to 4:50pm, Marriott, Room 407, 4th Floor

Abstract

Human decision making is flawed. Evidence suggests that judges impose harsher sentences on less sleep, avoid changing the status quo on low blood sugar, and unduly punish defendants after their football team suffers a loss. Consequently, the promise of dispassionate, untiring machines making data-driven decisions seems a welcome innovation. However, such systems are themselves subject to bias. Created by people and trained on historical data, poorly crafted algorithms can bake in existing biases, obscuring them behind a gloss of mathematics. It is therefore necessary to consider carefully how such models are constructed and used. An error-prone risk model may be welcome when used to help triage scarce social aid but morally unconscionable when used to determine sentencing. This presentation explores how one should best approach the construction and use of such models, with a focus on understanding what a model is optimizing (its definition of success) and how this should constrain its use.
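The point about an error-prone model being acceptable for triage but not for sentencing can be made concrete with a small sketch. The code below is purely illustrative, with hypothetical risk scores and outcomes; it shows how the same model's decision threshold trades false positives (wrongly flagged as high risk) against false negatives, and why the tolerable tradeoff depends on what a false positive costs in a given use.

```python
# Illustrative sketch with hypothetical data: the same risk model may be
# tolerable for triaging aid but not for sentencing, because the cost of
# a false positive differs between the two uses.

# Hypothetical (risk_score, reoffended) pairs from a held-out evaluation.
outcomes = [
    (0.9, True), (0.8, True), (0.75, False), (0.6, True),
    (0.55, False), (0.4, False), (0.35, True), (0.2, False),
    (0.15, False), (0.1, False),
]

def error_rates(threshold):
    """Return (false-positive rate, false-negative rate) when everyone
    scoring at or above `threshold` is treated as high risk."""
    fp = sum(1 for s, y in outcomes if s >= threshold and not y)
    fn = sum(1 for s, y in outcomes if s < threshold and y)
    negatives = sum(1 for _, y in outcomes if not y)
    positives = sum(1 for _, y in outcomes if y)
    return fp / negatives, fn / positives

# A triage use might tolerate many false positives (extra aid offered);
# a sentencing use cannot (extra punishment imposed).
for t in (0.3, 0.7):
    fpr, fnr = error_rates(t)
    print(f"threshold={t}: false-positive rate={fpr:.2f}, "
          f"false-negative rate={fnr:.2f}")
```

On this toy data, a low threshold catches every reoffender but wrongly flags half of the non-reoffenders, while a high threshold does the reverse; neither is "the" right setting, which is exactly why the model's definition of success must be matched to its use.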