Individual Submission Summary

Fake News and Social Media: The Communication Strategies of Russian Trolls

Fri, August 31, 2:00 to 3:30pm, Marriott, Vineyard

Abstract

The spread of “fake news” and Russian trolling is thought to have had a decisive impact on the 2016 US presidential campaign. President Obama personally raised the matter of fake news with Facebook’s Mark Zuckerberg, and the activities of Russian trolls were identified by the Director of National Intelligence as part of a wider Russian influence campaign. Given the limited resource requirements and minimal consequences for being outed as a purveyor of misinformation on social media, these platforms can facilitate the circulation of misinformation on a scale not previously possible. Because these networks are global, they also enable adversaries to carry out foreign influence campaigns, or “active measures,” without setting foot in the target country. The United States is hardly the only target of such campaigns, and Russia is hardly the only perpetrator of active measures, which makes understanding how social media platforms are used for foreign influence campaigns a matter of global concern.
Framing the issue as one of fake news casts it as an epistemic problem: claims without any correspondence to an underlying domain of facticity are being widely distributed. By itself, however, this framing fails to address the motives for belief in and rediffusion of such claims. Russian trolls are unlikely to be effective at shaping narratives or moving voters unless their messages persuade in some sense. The content of these communications can be categorized according to whether it persuades through informational claims, whether veridical or misinformation, or by seeking to shape the manner in which persons identify with other political actors.
This research examines the communication operations and persuasion strategy of Russian trolls on Twitter during the 2016 general election in the United States. Trolls are identified using a unique dataset of 2,752 Twitter accounts operated out of Russia’s “Internet Research Agency,” which has been released by US intelligence authorities. These accounts are studied within a dataset of approximately 175 million tweets produced between August 2016 and election day. The research does the following:
1. Characterizes the activities and self-presentation of these Twitter accounts and identifies the attributes that differentiate them from other actors on Twitter during the campaign.
2. Based on these attributes, estimates the wider population of Russian troll accounts that have not yet been identified.
3. Quantifies the level of “fake news” by comparing tweet content with Politifact’s database of falsehoods from the campaign.
4. Identifies which kinds of messages circulated most widely: misinformation claims (i.e., “fake news”), veridical informational claims, veridical identity claims, and unprovable identity claims.
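The account-flagging and claim-matching steps above could be sketched roughly as follows. This is a minimal illustration only: the handles, tweets, and claim keywords are hypothetical placeholders, and the abstract does not specify the study’s actual matching procedure (which would need to be far more sophisticated than simple substring search).

```python
# Sketch of steps 1 and 3: flag tweets from known troll accounts and
# naively match tweet text against fact-checked claim keywords.
# All data below is illustrative, not from the study.

TROLL_ACCOUNTS = {"TEN_GOP", "Jenn_Abrams"}  # hypothetical IRA handles

FACT_CHECKED_CLAIMS = [          # hypothetical claim keywords
    "millions of illegal votes",
    "pope endorses",
]

def classify_tweet(author: str, text: str) -> dict:
    """Label a tweet by author type and whether it echoes a checked claim."""
    lowered = text.lower()
    return {
        "troll": author in TROLL_ACCOUNTS,
        "matches_claim": any(c in lowered for c in FACT_CHECKED_CLAIMS),
    }

tweets = [
    ("TEN_GOP", "BREAKING: millions of illegal votes counted!"),
    ("some_user", "Great weather at the rally today."),
]

results = [classify_tweet(a, t) for a, t in tweets]
```

A real pipeline would additionally normalize text, handle retweets and URLs, and use fuzzy rather than exact matching against the fact-check database.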
The research makes both practical and theoretical contributions. First, in theoretical terms, the findings will help identify whether and how trolling activities exert influence and produce consequences in online communication spaces. Although there is a venerable tradition in political science holding that voters are moved by informational claims, this view has recently come under challenge, as some research suggests that political opinions and voting decisions are more strongly influenced by identity considerations. This research contributes to that literature. Second, in practical terms, it offers a better understanding of the extent of the problem posed by Russian trolls, as well as insights into how they operate, which can aid in the development of countermeasures.
