Individual Submission Summary

Perceptions of sexualized deepfake abuse in the U.S.: The impact of deepfake victims’ gender and race on attributions of blame and harm

Sat, Nov 16, 8:00 to 9:20am, Sierra I - 5th Level

Abstract

Deepfake abuse, or the non-consensual creation and sharing of sexualized and nudified synthetic imagery, has recently proliferated across the globe, with victims primarily being women and girls. Meanwhile, researchers have only just begun to understand this growing form of gender-based sexual violence. In particular, research has yet to examine perceptions of deepfake abuse in light of participants’ exposure to realistic-looking deepfake images. In this roundtable session, Dr. Asia Eaton will discuss her involvement in a multinational, grant-funded research team, including Drs. Asher Flynn (PI), Anastasia Powell, and Adrian Scott, aiming to better understand the nature, prevalence, and public perception of deepfakes. Her contributions to this session will include describing the methods and findings of an experiment investigating perceptions of deepfake abuse using a large and diverse sample of U.S. adults. Specifically, using entirely AI-generated but lifelike images of nude targets, she will describe a study in which U.S. participants were randomly exposed to one of six artificial deepfakes in a 2 (victim gender: woman vs. man) × 3 (victim race: white, Black, or East Asian) design. The effects of victim characteristics on participant attributions of victim blame, victim harm, and perpetrator responsibility will be discussed.

Authors