Individual Submission Summary

Accuracy and Fairness in the Use of Facial Recognition Technology by Law Enforcement

Thu, Nov 14, 9:30 to 10:50am, Sierra D - 5th Level

Abstract

With the increasing prevalence of facial recognition technology (FRT), law enforcement officers and legal practitioners need to understand its accuracy when applied to real-world images, as opposed to the high-quality images typically used for testing. Without this knowledge, they may trust the algorithm more, or less, than they should, which can lead to misclassifications and thus to miscarriages of justice. Our study evaluates the accuracy and fairness of a specific FRT, offering insights for stakeholders such as police departments, defense attorneys, judges, researchers, and society at large. Using StyleGAN3, we generate high-quality synthetic faces, which we label with FairFace for demographic attributes. We then simulate real-world conditions by manipulating illumination, resolution, and pose to create low-quality images. These images are subjected to facial recognition tasks using DeepFace, which incorporates state-of-the-art models employing the ArcFace loss function. Our results show* that these image-quality factors have a significant impact on the accuracy and fairness of FRT, underscoring the need for further research to ensure the appropriate implementation and interpretation of FRT results within the criminal justice system.

*Note: we do not have results yet. This abstract is written with a placeholder, which will be edited when we do have results.
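The degradation-then-verification step described in the abstract can be illustrated with a minimal sketch. The file names, degradation parameters, and the use of Pillow for the image manipulations below are illustrative assumptions rather than the study's actual settings; only the DeepFace verification call with model_name="ArcFace" reflects the tooling named in the abstract.

# Minimal sketch (not the authors' code): simulate two of the abstract's
# degradations, reduced resolution and altered illumination, with Pillow,
# then run a verification using DeepFace's ArcFace model. File paths and
# parameter values are hypothetical placeholders.
from PIL import Image, ImageEnhance
from deepface import DeepFace

def degrade(src_path, dst_path, scale=0.25, brightness=0.5):
    """Save a low-quality copy of src_path: downscale then upscale to discard
    detail, and darken the result to mimic poor illumination."""
    img = Image.open(src_path).convert("RGB")
    w, h = img.size
    low_res = img.resize((max(1, int(w * scale)), max(1, int(h * scale))))
    low_res = low_res.resize((w, h))  # restore original dimensions, detail lost
    ImageEnhance.Brightness(low_res).enhance(brightness).save(dst_path)

# Hypothetical file names for a StyleGAN3-generated face and its degraded copy.
degrade("synthetic_face.png", "synthetic_face_degraded.png")

# Compare the clean and degraded images; enforce_detection=False keeps DeepFace
# from rejecting heavily degraded images in which no face is detected.
result = DeepFace.verify(
    img1_path="synthetic_face.png",
    img2_path="synthetic_face_degraded.png",
    model_name="ArcFace",
    enforce_detection=False,
)
print(result["verified"], result["distance"])

Pose manipulation and FairFace demographic labeling, which the abstract also describes, are omitted here to keep the sketch self-contained.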

Authors