The use of social media platforms is embedded in our routine activities and, consequently, shapes criminal activity. Social media platforms can be considered cyber places where people, including criminals, converge. Legislation, particularly European law, is increasing the responsibilities of social media platforms in addressing cybercrime. Nonetheless, the preventive capacities of social media platforms remain rarely explored in criminology.
This communication argues that content moderation, understood as the private power of digital platforms to dictate and enforce rules in their spaces, should be viewed as a form of prevention and third-party policing. Specifically, drawing on routine activities theory, it postulates that content moderation constitutes a form of guardianship (place management), testing whether the four place management activities proposed by Madensen (organization of physical space, regulation of conduct, control of access, and acquisition of resources) are suited to understanding and guiding online moderation.
To that end, a sample of community rules and user agreements from very large online platforms (VLOPs) and very large online search engines (VLOSEs) is analyzed, together with the mandatory enforcement reports required under the DSA, matching existing moderation practices to Madensen's place management activities.