Individual Submission Summary
Algorithmic Authority and Epistemic Asymmetry: A Du Boisian Theory of AI

Sat, August 8, 2:00 to 3:00pm, TBA

Abstract

Artificial intelligence (AI) is frequently framed as an informational technology that generates text, audio, and video through algorithmic prediction, and whose social harms can be mitigated through fairness constraints and bias correction. This output-centered framing assumes that inequality in AI originates from distorted results within otherwise neutral systems. Such an approach, however, overlooks a more fundamental issue: the knowledge on which AI systems are trained is produced within historically structured relations of power.

I define this unequal distribution of interpretive authority as epistemic asymmetry: who is authorized to produce knowledge, what counts as legitimate knowledge, and whose reasoning counts as objective and scientific. If AI systems are trained on knowledge generated under the historical conditions of epistemic asymmetry, then concerns extend beyond biased outputs to the organization of epistemic authority itself. AI becomes an epistemic institution in which historically structured epistemic authority is reorganized and amplified.

To theorize how power structures knowledge, I recenter W. E. B. Du Bois as a foundational analyst of epistemic asymmetry. Du Bois’s concepts of the veil, double consciousness, and the color line reveal how racial domination structures the production and validation of knowledge. Extending this framework beyond race, I argue that AI marks the computational reconstitution of historically dominant knowledge. AI systems are trained on historically skewed data generated within unequal social relations and therefore encode existing epistemic asymmetries. Through processes of data labeling, classification, and optimization, these asymmetries are formalized within model architectures. When AI-generated predictions guide decision-making in concrete domains such as labor markets, healthcare, policing, and migration governance, algorithmic authority shapes the distribution of opportunities and constraints. From this perspective, I argue that AI is the recursive institutionalization of epistemic asymmetry and the computational continuation of historically structured knowledge–power relations.
