Mandatory transparency regimes are often assumed to reveal how platforms govern content. This paper shows instead that they produce standardized, auditable accounts shaped by regulatory schemas and organizational disclosure choices. Analyzing 3.37 billion Statements of Reasons from the EU Digital Services Act Transparency Database (Facebook, Instagram, TikTok, Reddit), I make three contributions. First, variance decomposition demonstrates that decision-stage automation is a stable platform-level trait, not a response to content type or time, yielding three governance postures: human-in-the-loop review, algorithmic decision-making, and community moderation. Second, I show that accountability decomposes into two distinct components—schema interface and categorical disclosure—with divergent stability and regulatory responsiveness. Third, I demonstrate that cross-platform enforcement comparisons depend on what the reporting schema can encode, and that apparent severity differences can be artifacts of legibility rather than of governance. Together, these findings reframe mandatory transparency as an enacted performance layer that shapes which forms of governance become visible and comparable.