The Online Safety Act received Royal Assent in 2023. The legislation empowers Ofcom, the UK's communications regulator, to hold user-to-user services responsible for assessing the harm caused by illegal content and for taking effective steps to manage and mitigate identified risks. The proliferation of online child sexual abuse material (CSAM) has in part been mitigated through technological solutions, such as hash matching, but these only go so far. Much of the work continues to be undertaken by human content moderators, individuals tasked with analysing flagged content and removing material that is offensive or harmful. Based on our work exploring the mental health impacts of moderating CSAM, this presentation discusses the ramifications of the Online Safety Act and the effects of moderation on workers. This work was funded by the Tech Coalition and End Violence, led by Jeffrey DeMarco and Elena Martellozzo and supported by Dr Ruth Spence and Paula Bradbury.