Nonconsensual deepfake pornography is the most common (mis)use of deepfake technology and disproportionately targets women. As the European Union Parliament’s report on deepfakes states, “at present, the legal roadmap for victims of deepfake pornography often remains unclear.” This study argues that adding criminal law to that roadmap is preferable and more legitimate than relying on other responses alone, because nonconsensual deepfake pornography causes serious harm, and technical measures and civil remedies fall short of effectively tackling it. Against this background, the study employs desk research, the legal-dogmatic method, and fundamental canons of legal interpretation to scrutinize the extent to which the criminal laws of fourteen European Union member states can effectively criminalize the dissemination of nonconsensual deepfake pornography. The findings demonstrate that the criminal codes of eight of the fourteen countries do not criminalize such dissemination. The upper limits of punishment differ significantly among the countries, and some member states rely on offences such as defamation, voyeurism, violation of privacy, and the nonconsensual sharing of sensitive personal data to penalize the wrongdoing, none of which are ideal instruments for the purpose.