Paper Summary

Gender and Racial Bias in AI-Generated Letters of Recommendation

Sat, April 11, 1:45 to 3:15pm PDT, JW Marriott Los Angeles L.A. LIVE, Floor: Gold Level, Gold 3

Abstract

As Generative AI (GenAI) permeates all aspects of life, including education, research is needed to understand the ways in which GenAI may reproduce societal inequities, since it generates responses based on existing, often biased, bodies of data. This study explores how ChatGPT-generated letters of recommendation for fictionalized students differ based on gendered and/or racialized names and on specific qualities typically viewed as masculine (e.g., analytical), feminine (e.g., kind), or a combination of both. ChatGPT-generated letters were analyzed using gendered and racial dictionaries, thematic coding, and blinded faculty identification of applicant race and gender. Though prompts were identical, letters differed in content based on subtle details, such as the applicant's last name.

Authors