The use of actuarial risk assessment tools in the criminal justice context continually raises concerns about racial bias. Tools are typically trained to maximize predictive accuracy, as evaluated through metrics such as the area under the receiver operating characteristic curve (AUC) and confusion matrices. Once developed, tools may then be examined for racial fairness by comparing AUC statistics across race groups and through differential prediction regressions, which determine whether the tool predicts similarly or differently for different race groups. We demonstrate the utility of fairness regularization models, which balance overall prediction against reductions in group-based differences. More specifically, we demonstrate a constrained version of the negative Bernoulli log-likelihood approach, optimized with the Constrained Optimization BY Linear Approximations (COBYLA) algorithm. We compare the predictive accuracy and racial differences of fairness-regularized models with those of more traditional risk tools using the same data.
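The abstract does not specify an implementation, but the general approach it describes can be sketched as follows: minimize the Bernoulli negative log-likelihood of a logistic risk model subject to a constraint on a group-based difference in predictions, using SciPy's COBYLA solver. The synthetic data, the choice of statistical-parity gap as the constrained quantity, and the tolerance `eps` are all illustrative assumptions, not details from the paper.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Hypothetical toy data: features X, binary outcome y, binary group indicator g.
n, p = 500, 3
X = rng.normal(size=(n, p))
g = rng.integers(0, 2, size=n)  # race-group indicator (illustrative)
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-X[:, 0]))).astype(float)

def predict(beta, X):
    # Logistic model: predicted probability of the outcome.
    return 1.0 / (1.0 + np.exp(-(X @ beta)))

def nll(beta):
    # Bernoulli negative log-likelihood (mean form), clipped for stability.
    p_hat = np.clip(predict(beta, X), 1e-9, 1 - 1e-9)
    return -np.mean(y * np.log(p_hat) + (1 - y) * np.log(1 - p_hat))

def group_gap(beta):
    # One possible fairness quantity: absolute difference in mean
    # predicted risk between the two groups (statistical-parity gap).
    p_hat = predict(beta, X)
    return abs(p_hat[g == 1].mean() - p_hat[g == 0].mean())

eps = 0.05  # allowed group gap; a tuning choice, not from the source

# COBYLA treats each constraint as feasible when fun(beta) >= 0,
# so "eps - gap >= 0" enforces gap <= eps.
cons = [{"type": "ineq", "fun": lambda b: eps - group_gap(b)}]

res = minimize(nll, x0=np.zeros(p), method="COBYLA", constraints=cons)
print(res.fun, group_gap(res.x))
```

Varying `eps` traces out the trade-off the abstract describes: a large tolerance recovers the unconstrained (accuracy-maximizing) fit, while a small one forces reductions in the group-based difference at some cost in overall prediction.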