
Algorithmic Violence

Sat, August 9, 2:00 to 3:00pm, West Tower, Hyatt Regency Chicago, Floor: Ballroom Level/Gold, Regency B

Abstract

Algorithms structure and govern our everyday lives as they are integrated into existing sociotechnical systems and structures. Social structures and institutions can create "structural violence," which prevents human beings from meeting their full potential (Galtung, 1969). Structural violence, unlike direct physical violence, may remain latent and invisible as people accept it as "the way things are," even though it may have severe consequences for their lives (Farmer & Weigel, 2013). This type of violence is built into social structures and shapes life chances and society as a whole. In this paper, I show how algorithms can create a new kind of structural violence, which I refer to as "algorithmic violence."

I illustrate how algorithmic violence develops and manifests on the ground using cases of algorithmic experiments in the Global South. I develop in-depth case studies of the ways in which algorithmic experiments, both the introduction of new algorithms on digital platforms and "tweaks" to existing algorithms, upend efforts at assembly and harm the workings of global civil society. The first case study focuses on Facebook's experiment with News Feed algorithms in 2018, which the social media company rolled out across six countries: Bolivia, Cambodia, Guatemala, Serbia, Slovakia, and Sri Lanka. The second case traces the development of nearly 100 Cambridge Analytica experiments in developing countries, dating from as early as 1994.

The case studies in this paper show inadequate responses to the structural violence arising from the global configurations surrounding algorithmic experiments. Algorithmic violence stems from inequities created and perpetuated by an assemblage of algorithms and related practices, norms, and institutions. Both scholarship and public conversations have predominantly focused on the transparency and accountability of algorithms and big data in the context of the Global North. I argue that addressing algorithmic violence requires structural interventions grounded in the lived experiences of marginalized people and communities.
