There is considerable debate as to whether Russia’s information operations shifted votes during the 2016 election. Sides et al. (2018) argue that the Russian efforts on social media likely had little consequence for the outcome of the election, because Russia’s intervention amounted to an insignificant messaging effort amidst the hundreds of millions of social media messages and informational cues received by voters. Further, research shows that campaign messages have limited duration and impact on recipients (Kalla and Broockman 2018; Nyhan 2018) and that domestic media outlets reached a wider audience than Russian operatives (Benkler 2018). Against these positions, former members of the intelligence community (Clapper 2018; Hayden 2018; Watts 2017) note that the hack of emails from the Democratic National Committee and Clinton’s campaign chair played a critical role in supporting key narratives during the 2016 election – and that candidate Trump amplified these narratives. Further, Jamieson (2018) notes that perhaps one of the most critical aspects of the Russian interference was the suppression of the African American vote, which was a central theme in their Facebook strategy. This research examines the extent to which Russian information operations may have created enduring attitude shifts through the repetition of content, and the role they may have played in shifting the content of the wider media ecosystem.
Research into the effects of Russian information operations has focused on the reach of these communications and the engagements they generated, but such dyadic and network-based approaches miss the role of information operations in reshaping the communication situation through the weaponization and amplification of narratives. This research looks more broadly at these questions by considering the role of Russian covert influence agents in selecting and amplifying narratives. It addresses two theoretical questions:
The role of elites and media in shaping political discourse. A long tradition of research in political science suggests that public opinion is elite driven and that elite cues are often transmitted through media outlets (Converse 1962; Zaller 1992). If so, a prior question arises: who influences the influencers? This research examines whether, and under what conditions, Russian “troll” or “sockpuppet” accounts “influenced the influencers” in media and politics, advancing populist challenges to authority and expertise.
Environmental influences on communication acceptance. Previous work on the diffusion of Russian influence operations has tended to focus on the transmission and receipt of communications on a single platform. This research models the flow of Russian information operations across multiple platforms and thus contributes to theories of intermedia agenda-setting. In addition, it examines the role of repetition in increasing the probability of acceptance. The results will determine whether Russian accounts primarily amplified existing media narratives or introduced new topics.
The data analysis is based on over 200 million tweets; posts and comments on Clinton’s and Trump’s official Facebook pages; and candidate speeches during the campaign. Broadcast media are defined in terms of the major newspapers and television news transcripts contained in the Nexis library. To address the questions above, we apply a variety of techniques from social network analysis, text mining, and causal inference that have been developed and applied in computer science by two of the authors.

Specifically, we use community detection, identification of influential entities (social elites), and estimation of the true source of influence (Kitsak et al. 2010; Salamanos et al. 2017; Vosoughi et al. 2018). This will identify both discourse communities and the centrality of Russian trolls in the network, allowing us to answer the following about Russian troll accounts: Are they strongly connected with legitimate influential/elite users? Do they follow a strategy when forming their network affiliations? In addition, based on the time delay of the transmitted tweets, we backtrack the history of fake news spreading and identify the source of diffusion on Twitter, i.e., the true source of the troll tweets (see the method of Shen et al. 2016). Furthermore, we use sentiment analysis to estimate the main factors that cause some fake tweets to become viral. Finally, we apply the Hawkes process model, a form of causal inference, to estimate the interplay between the social platforms; Hawkes processes have been used extensively in prior work with very good results (Zannettou et al. 2018).
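The backtracking step described above can be sketched in miniature: given retweet records with timestamps, follow each retweet edge backwards until reaching an account that never retweeted anyone, which is the apparent origin of the cascade. This is only an illustrative sketch, not the authors’ implementation of Shen et al. (2016); all account names and timestamps are hypothetical.

```python
# Illustrative sketch: backtracking a retweet cascade to its source.
# Account names and timestamps are hypothetical, for demonstration only.

# (retweeter, account_retweeted, minutes_since_first_post)
retweets = [
    ("user_a", "troll_1", 5),
    ("user_b", "user_a", 9),
    ("user_c", "troll_1", 12),
    ("user_d", "user_b", 20),
]

def backtrack_source(retweets):
    """Walk each retweet edge backwards to the account that never
    retweeted anyone: the apparent origin of the cascade."""
    parent = {rt: src for rt, src, _ in retweets}
    sources = set()
    for node in parent:
        while node in parent:      # climb the cascade tree
            node = parent[node]
        sources.add(node)
    return sources

def cascade_depths(retweets):
    """Depth of each retweeting account in the cascade (source = 0)."""
    parent = {rt: src for rt, src, _ in retweets}
    def depth(n):
        return 0 if n not in parent else 1 + depth(parent[n])
    return {n: depth(n) for n in parent}

print(backtrack_source(retweets))   # {'troll_1'}
print(cascade_depths(retweets))     # user_d sits three hops from the source
```

In the actual study the cascade tree is noisy and partially observed, which is why timestamp delays and the statistical machinery of Shen et al. (2016) are needed; this sketch shows only the underlying tree-walking idea.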