Image-based sexual exploitation is defined as the act of capturing or disseminating another person's intimate or nude photo(s) or film(s) without that person's consent. More recently, the use of artificial intelligence (AI) to create deepfake pornographic videos and images has intensified these concerns. Because deepfake-enabled image-based sexual abuse is a relatively recent development, however, there remains a gap in the literature exploring how the media frames this type of harm. To address this gap, the current research examines how Canadian news media frames the issue of image-based sexual exploitation through deepfakes in order to assess public discourse. Drawing from approximately 50 news-media articles, the results reveal three dominant narratives: "It's a Technological Issue", "Who is Accountable", and "It's Traumatizing and Damaging". Factors that explain variations within each theme are discussed, as well as how the media largely neglects to frame this issue as a form of gender-based violence.