Individual Submission Summary
Fairness Regularized Risk Assessment Models: Balancing Risk Prediction and Racial and Ethnic Equality

Wed, Nov 12, 2:00 to 3:20pm, Ledroit Park - M3

Abstract

The use of actuarial risk assessment tools in the criminal justice context continually raises concerns about racial bias. These tools are typically trained to maximize predictive accuracy as evaluated through metrics such as area under the receiver operating characteristic curve (AUC) statistics and confusion matrices. Once developed, a tool may then be examined for racial fairness by comparing AUC statistics across race groups and by fitting differential prediction regressions, which test whether the tool predicts similarly or differently for different race groups. We demonstrate the utility of fairness-regularized models, which balance overall predictive accuracy against reductions in group-based differences. More specifically, we demonstrate a constrained version of the negative Bernoulli log-likelihood approach fit with Constrained Optimization BY Linear Approximations (COBYLA). We compare the predictive accuracy and racial differences of fairness-regularized models with those of more traditional risk tools using the same data.
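The general approach described in the abstract can be illustrated with a minimal sketch: a logistic model fit by minimizing the negative Bernoulli log-likelihood, subject to a constraint on the between-group gap in mean predicted risk, optimized with SciPy's COBYLA implementation. The synthetic data, the gap tolerance `tau`, and the specific constraint form are assumptions for illustration, not the authors' specification.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Synthetic data (illustrative only): features X, binary outcome y,
# and a binary group indicator g standing in for race/ethnicity.
n, d = 500, 3
X = rng.normal(size=(n, d))
g = rng.integers(0, 2, size=n)
logits = X @ np.array([1.0, -0.5, 0.3]) + 0.8 * g
y = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(float)

def predict(beta):
    """Predicted risk under a logistic model with intercept beta[0]."""
    return 1 / (1 + np.exp(-(X @ beta[1:] + beta[0])))

def neg_log_lik(beta):
    """Mean negative Bernoulli log-likelihood of the observed outcomes."""
    p = np.clip(predict(beta), 1e-9, 1 - 1e-9)
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

def group_gap(beta):
    """Absolute difference in mean predicted risk between the two groups."""
    p = predict(beta)
    return abs(p[g == 1].mean() - p[g == 0].mean())

tau = 0.02  # assumed tolerance on the between-group gap
# COBYLA handles inequality constraints of the form c(beta) >= 0.
cons = [{"type": "ineq", "fun": lambda b: tau - group_gap(b)}]

res = minimize(neg_log_lik, x0=np.zeros(d + 1), method="COBYLA",
               constraints=cons, options={"maxiter": 5000})
```

Tightening `tau` trades predictive fit for smaller group differences; the paper's comparison of fairness-regularized and traditional tools corresponds to varying this trade-off.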

Authors