Deepfake nudes represent a stark evolution in image-based sexual abuse, driven by the rapid rise and accessibility of generative artificial intelligence (AI). Drawing on survey data from 1,200 young people aged 13-20 from across the United States, we investigated their awareness, perceptions, and experiences with this emergent category of abuse. Three key findings emerged: (1) young people overwhelmingly recognize deepfake nudes as a form of technology-facilitated abuse that harms the person depicted, (2) deepfake nudes are already a real experience that young people must navigate, and (3) young people who admit to creating deepfake nudes of others describe easy access to the technologies involved. These findings carry urgent implications for prevention and intervention, including aligning societal messaging to acknowledge these harms, establishing comprehensive support protocols in youth-serving organizations, and implementing technical safeguards to prevent technology misuse at scale. Findings, implications, and recommendations will be discussed in greater detail during the panel.