Individual Submission Summary

Resisting Algorithms, and Algorithms of Resistance

Fri, November 10, 4:00 to 5:45pm, Hyatt Regency Chicago, Columbian, Concourse Level West Tower

Abstract

Algorithms occupy a privileged position in our current moment. They are used to explain raw data to us, to explain why data cannot be understood, and to produce knowledge about political questions, beings, aspirations, and what slips outside knowability. Other scholars have outlined how algorithmic models fail at the level of the model itself, or how algorithms get things wrong. But that argument presupposes that we could indeed work to get algorithms right. Here I propose a theoretical analysis of what some recent national election pollsters deemed “unquantifiable sentiments,” or the catchall phrase “excess data”: that which cannot be processed normally. Whether it is trying to fix excess data that cannot be captured within algorithms, or trying to become “excess” and thus unreadable by those algorithms, much of the current narrative about algorithms focuses on legibility and knowability.

What does it mean to be legible or visible to an algorithm? How does one obfuscate that visibility by becoming unreadable or opaque to the algorithm that seeks to make sense of it? There is an extensive archive of work on what opacity means in our moment, including work by Wendy Chun, Zach Blas, Jacob Gaboury, Édouard Glissant, Liz Losh, and others. Opacity is offered as an alternative to the familiar terrain of identity, visibility, and representation in feminist and queer politics and writing. Across various theoretical approaches to opacity in the face of demands for “visibility politics” and visualization software, the question of how to become imperceptible to the filter of algorithmic data knowledge is an important way to theorize tactical refusal and subversion. Many of the projects often called upon to demonstrate this theory of opacity do understand it as a coalition-building tactic with a social justice arc, as a way to develop a collaborative strategy for opacity within local networks of surveillance. But opacity often also functions at the scale of the individual, not the collective.

This paper will collect and analyze scenes of collective resistance to algorithms in our current political moment, and offer strategies for framing algorithmic knowledge for our students.
