Information Wars: A Window into the Alternative Media Ecosystem

Kate Starbird
HCI & Design at UW
Mar 15, 2017

Conspiracy Theories, Muddled Thinking, and Political Disinformation

Background: Examining “Alternative Narratives” of Crisis Events

For more than three years, my lab at the University of Washington has conducted research looking at how people spread rumors online during crisis events. We have looked at natural disasters like earthquakes and hurricanes as well as man-made events such as mass shootings and terrorist attacks. Due to the public availability of data, we focused primarily on Twitter — but we also used data collected there (tweets) to expose broader activity in the surrounding media ecosystem.

Over time, we noted that a similar kind of rumor kept showing up, over and over again, after each of the man-made crisis events — a conspiracy theory or “alternative narrative” of the event that claimed it either didn’t happen or that it was perpetrated by someone other than the current suspects.

We first encountered this type of rumor while studying the Boston Marathon bombings in 2013. We noticed a large number of tweets (>4000) claiming that the bombings were a “false flag” perpetrated by U.S. Navy SEALs. The initial spread of this rumor involved a “cascade” of tweets linking to an article on the InfoWars website. At the time, our researchers did not know what InfoWars was, but the significance of that connection became clear over time.

In subsequent crisis events, similar rumors appeared. After the Umpqua Community College shooting, a rumor claimed the event was staged by “crisis actors” for political reasons — specifically to justify legal restrictions on gun rights. And after the shootings at the Orlando Pulse nightclub, a rumor suggested they were committed by someone other than the accused gunman — with the purpose of falsely blaming the attack on Muslims. For every man-made crisis event we studied, we found evidence of alternative narratives, often shared by some of the same accounts and connected to some of the same online sites.

These rumors had different “signatures” from other types of rumors. In terms of volume (measured in tweets per minute), most crisis-related rumors spike quickly and then fade out relatively quickly as well, typically “decaying” at an exponential rate. But these alternative narrative rumors rose more slowly, and then they lingered, ebbing and flowing over the course of days or weeks (or years). They also had sustained participation by a set group of Twitter users (i.e. many tweets per user over an extended period of time), rather than finite participation by a large number of users (one or two tweets per user, all at around the same time) as typical rumors do. Additionally, alternative narrative rumors often had high “domain diversity”, in that tweets referencing the rumors linked to a large number of distinct domains (different websites), including alternative media sites such as InfoWars, BeforeItsNews, and RT (aka Russia Today). Several of these rumors also had a strong “botnet” presence — in other words, many participating Twitter accounts were not “real” people, but were operated by a computer program that controlled a large number of accounts.
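To make these signatures concrete, here is a minimal sketch of how one might compute them from a set of tweet records. The record structure and example values are invented for illustration; this is not the lab’s actual analysis pipeline.

```python
from collections import Counter
from datetime import datetime

# Hypothetical tweet records: (account, timestamp, linked_domain).
tweets = [
    ("userA", datetime(2016, 6, 12, 10, 0), "worldtruth.tv"),
    ("userA", datetime(2016, 6, 13, 9, 30), "beforeitsnews.com"),
    ("userA", datetime(2016, 6, 20, 14, 5), "worldtruth.tv"),
    ("userB", datetime(2016, 6, 12, 10, 1), "nytimes.com"),
]

def rumor_signature(tweets):
    """Compute the three 'signature' metrics discussed above."""
    accounts = Counter(acct for acct, _, _ in tweets)
    times = sorted(ts for _, ts, _ in tweets)
    lifespan_days = (times[-1] - times[0]).total_seconds() / 86400
    domains = {dom for _, _, dom in tweets if dom}
    return {
        "tweets_per_user": len(tweets) / len(accounts),  # sustained vs. one-off participation
        "lifespan_days": lifespan_days,                  # long tail vs. quick exponential decay
        "domain_diversity": len(domains),                # number of distinct cited domains
    }

sig = rumor_signature(tweets)
```

A typical rumor would show tweets-per-user near 1 and a lifespan of hours; the alternative narrative rumors showed sustained per-user participation, long lifespans, and high domain diversity.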

In our very first study (about the 2013 Boston Marathon Bombings) we noted that alternative narrative rumors intersected with politicized content. Analysis of co-occurring hashtags showed that #falseflag often appeared in the same tweets as #obama, #nra, #teaparty, #tcot, #tlot, #p2. As a researcher of crisis informatics, I’ve often noted how crises become politicized in online spaces (and elsewhere), but this was different, as the false flag rumor appeared to be deeply connected to political themes and propagated for a distinctly political purpose.

Strange Commonalities and Connections: Why We Shifted Focus
Initially, we chose not to dwell on these types of rumors, thinking that they had little impact on our core research questions — how people respond to crisis events and how we could make the information space more useful for crisis-affected people by detecting false rumors. These alternative narrative rumors rarely resonated within crisis-affected populations. And so, though we often remarked upon them when they surfaced in our data, we maintained our research focus elsewhere.

However, in early 2016, in the wake of the Umpqua Community College shootings and the coordinated terror attacks in Paris, a few of my students decided to take a closer look at what they perceived to be commonalities in the alternative narratives spreading on Twitter about the two different events — as well as what they thought to be a botnet driving a large portion of that content.

[Both of these hunches turned out to be true. The botnet was connected to “the Real Strategy” or TheRealStrategy.com. This botnet coordinated hundreds of accounts that tweeted content related to several different alternative narratives from these events and others. Though some of those accounts have been deleted, others are still operational, new ones have been created, and they continue to publish and tweet out content related to numerous conspiracy theories.]

Using Twitter data collected during these events, the students built network graphs that revealed connections between different Twitter accounts — and between different “communities” of accounts — participating in these alternative narratives. When we went to examine the data in Winter 2016, we were extremely confused by some of the intersections. Why were a handful of “Anonymous” accounts and GamerGaters connected with Pro-Palestinian accounts on one side and European white nationalists on another? Why were seemingly left-wing supporters of Wikileaks connecting with seemingly right-wing supporters of Donald Trump? And why did these groups come together to talk about alternative narratives of mass shooting events? It didn’t make sense. Yet.

A Systematic Exploration of the Alternative Media Ecosystem through the Lens of Alternative Narratives of Mass Shooting Events
Almost a year later, motivated by the political disruptions of 2016, the rhetoric around “fake news” and alternative media, and this nagging feeling that there was something in our online rumoring data that could provide insight into these issues, we completed a systematic study of alternative narratives of mass shooting events, looking specifically at the alternative media ecosystem that generates them and supports their spread. A first paper resulting from this work was recently reviewed and accepted to the ICWSM 2017 conference. I have uploaded a pre-print version of this paper to my website.

In the remainder of this blog, I am going to describe some of that research, including the methods and the main findings. These findings touch on the nature of alternative media, including the presence of (and connections between) conspiracy theories, political propaganda, and disinformation.

Methods of Data Collection and Analysis

On January 1, 2016, our lab launched a Twitter collection focused specifically on shooting events. We kept this collection going for more than nine months, until October 6, tracking (English-language) terms including shooting, shootings, gunman, and gunmen. From this collection, we then identified tweets that referenced alternative narratives — i.e. tweets that also contained terms such as “false flag”, “hoax”, and “crisis actor”.
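As a rough illustration of this two-stage filter, the sketch below uses simple keyword and phrase matching. The study’s actual collection infrastructure is not described here, so the function names and matching logic are assumptions.

```python
import re

# Stage 1: tracking terms used for the broad Twitter collection.
COLLECTION_TERMS = {"shooting", "shootings", "gunman", "gunmen"}
# Stage 2: phrases indicating an alternative narrative.
NARRATIVE_TERMS = ["false flag", "hoax", "crisis actor"]

def matches_collection(text):
    """Would this tweet have been caught by the keyword tracker?"""
    words = set(re.findall(r"[a-z]+", text.lower()))
    return bool(words & COLLECTION_TERMS)

def references_alternative_narrative(text):
    """Does the tweet also contain an alternative-narrative phrase?"""
    lowered = text.lower()
    return any(term in lowered for term in NARRATIVE_TERMS)

tweet = "The Orlando shooting was a false flag, wake up!"
keep = matches_collection(tweet) and references_alternative_narrative(tweet)
```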

Next, we created a network map of the Internet domains referenced in these tweets. In other words, we wanted to see what websites people cited as they talked about and constructed these alternative narratives, as well as how those different websites were connected. To do that, we generated a graph where nodes were Internet domains (extracted from URL links in the tweets). In this graph, nodes are sized by the overall number of tweets that linked to that domain and an edge exists between two nodes if the same Twitter account posted one tweet citing one domain and another tweet citing the other. After some trimming (removing domains such as social media sites and URL shorteners that are connected to everything), we ended up with the graph you see in Figure 1. We then used the graph to explore the media ecosystem through which the production of alternative narratives takes place.
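The graph construction described above can be sketched in a few lines using only the standard library (a real analysis would likely use a graph library such as networkx plus a visualization tool like Gephi). The accounts below are invented; the node-sizing, edge, and trimming logic follow the description in the text.

```python
import itertools
from collections import Counter, defaultdict

# Hypothetical (account, cited_domain) pairs extracted from tweet URLs.
citations = [
    ("acct1", "veteranstoday.com"), ("acct1", "beforeitsnews.com"),
    ("acct2", "beforeitsnews.com"), ("acct2", "nodisinfo.com"),
    ("acct3", "veteranstoday.com"), ("acct3", "twitter.com"),
]

# Trim domains (social media sites, URL shorteners) that connect to everything.
TRIM = {"twitter.com", "facebook.com", "bit.ly", "tinyurl.com"}

# Node weight: total number of tweets linking to each domain.
node_size = Counter(dom for _, dom in citations if dom not in TRIM)

# Edge between two domains whenever the same account cited both.
cited_by = defaultdict(set)
for acct, dom in citations:
    if dom not in TRIM:
        cited_by[acct].add(dom)

edges = set()
for domains in cited_by.values():
    edges.update(itertools.combinations(sorted(domains), 2))
```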

Figure 1. Domain Network Graph, Colored by Media Type
Purple = mainstream media; Aqua = alternative media;
Red = government controlled media

After generating the graph, we conducted an in-depth qualitative analysis of all of the domains in the graph — reading their home and About pages, identifying prominent themes in their current website, searching for specific themes within their historical content, examining other available information (online) about their owners and writers, etc. Below, I discuss what we learned about this alternative media ecosystem through this analysis.

Alternative Media Were Cited for Supporting Alternative Narratives; Mainstream Media Were Cited for Challenging Them

The network graph represents a subsection of the larger media ecosystem — it is a snapshot of the “structure” of the conversation around alternative narratives. After trimming to domains cited multiple times (and by multiple people), the graph contains 117 total domains. We determined 80 of these to belong to “alternative media” (Figure 1, colored Aqua) and 27 to belong to mainstream media (Figure 1, colored Purple). Other domains include three belonging to NGOs and two belonging to media outlets funded by the Russian government (RT.com and SputnikNews.com).

It’s important to note that not all of these domains contained content promoting alternative narratives of shooting events. In the Twitter conversations about these alternative narratives, domains were cited in different ways for different kinds of content.

More than half of the domains in the graph (and more than 80% of the alternative media domains) were cited for content explicitly supporting the alternative narratives. However, others (especially mainstream media) were cited for factual accounts of the events, and then used as evidence by conspiracy theorists as they built these theories. And a few were referenced for their denials of these theories. Below are examples of each, to give you a sense of how tweets referenced external domains.

Supporting: The tweet below links to an article in the WorldTruth.tv domain which claims that witness accounts of multiple gunmen (which conflict with the official account) suggest that the Orlando Pulse nightclub shooting is some sort of false flag. Contradictory and dynamic information — typical of the fog-of-war type situations that occur after crisis events — is often used as “evidence” to support alternative narratives of these events.

As Evidence: The tweet below claims that one of the witnesses to the Orlando shooting is an actor and that the shootings were a false flag. This echoes a common theme, which appears across many alternative narratives in our research, that “crisis actors” are used to stage events. The tweet links to an article in the Toronto Star domain which contains a neutral, factual account of the event.

Denying: This tweet links to the New York Times domain — to an article that refutes several different alternative narratives of the Orlando shootings. However, instead of aligning with the arguments in that article, this tweet is accusing the New York Times of being a participant in the conspiracy/hoax/false flag.

[Following Twitter’s rules, I am only providing examples here of tweets that are still publicly available on Twitter. I have also attempted to choose accounts for these examples that seem to intentionally propagate alternative narratives — in other words, I am attempting to avoid calling out individuals/accounts that might be uncomfortable being associated with these ideas.]

Most of the domains cited in the production of alternative narratives were “alternative media” domains, and most of these (68 of 80) were cited (linked-to) in the tweets we collected for content that explicitly supported alternative narratives. As you can see in the graph (Figure 1), the alternative media ecosystem is tightly connected — i.e. the Twitter users who produce alternative narratives often cite several different alternative media domains in their conspiracy theory tweets. The three main hubs in this particular network are VeteransToday.com, BeforeItsNews.com, and NoDisinfo.com, but there are many other alternative media domains that play a significant role in the production of alternative narratives. This alternative media ecosystem (a subset of the larger graph) is the focus of the remainder of this blog.

However, I want to explicitly note and clarify one aspect of the graph: though mainstream media domains like the Washington Post, the New York Times, and Fox News appear in the graph, no mainstream media domain in this graph hosted any content promoting the alternative narratives we were studying. Instead, they were typically cited in our Twitter data for general content about the event that was later used as “evidence” of a conspiracy. Mainstream media were also cited for corrections of the alternative narratives (sometimes in tweets supporting those corrections, sometimes in tweets contesting them). In the case of the New York Times, the newspaper posted an article explicitly denying alternative narratives of the Orlando shooting event. This denial was then cited several times by those promoting those narratives — as even more evidence for their theory. [This demonstrates a vexing aspect of rumor-correcting in this context — that corrections often backfire.]

The network graph does reveal some mainstream media sites to be more integrated into the alternative media ecosystem. For example, several people who tweet links to VeteransToday.com also tweet links to FoxNews.com, pulling it closer into that part of the graph.

The Role of Botnets in Amplifying Alternative Narratives

These data also provide insight into the role of automated accounts (botnets) in amplifying alternative narratives. For example, the most tweeted domain in our data was TheRealStrategy.com. It was tweeted so many times (7436) and connected to so many domains (relative to all other domains) that we had to remove it from the graph. [It was the only highly cited, highly connected media domain we removed.] Examining the temporal patterns (tweets over time) suggests that almost all of the tweets that linked to this domain were generated by a computer program. That program operated hundreds of different accounts, directing them to tweet out in regular bursts (dozens at the same time). Most often, these tweets linked to TheRealStrategy, but the program also sprinkled in tweets linking to other alternative media domains. Closer analysis revealed many of these Twitter accounts to have similar profile descriptions and to use photos stolen from other people online. This is a very sophisticated botnet that seems to be effectively bringing “real” accounts into its friend/following networks — and primarily propagating conspiracy theories and politicized content.
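A simple heuristic for spotting this kind of synchronized bursting is to bin tweets by minute and flag bins where many distinct accounts link to the same domain at once. The sketch below uses invented records and an arbitrary threshold; it illustrates the idea rather than the study’s actual method.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical (timestamp, account, domain) records: 30 accounts tweeting
# within the same minute, plus a few scattered organic-looking tweets.
tweets = [
    (datetime(2016, 3, 1, 12, 0, s), f"bot{s:02d}", "therealstrategy.com")
    for s in range(30)
] + [
    (datetime(2016, 3, 1, 12 + h, 3), f"user{h}", "therealstrategy.com")
    for h in range(3)
]

def burst_minutes(tweets, threshold=10):
    """Flag (minute, domain) bins where many distinct accounts post at once;
    synchronized bursts like these are a simple botnet signal."""
    bins = defaultdict(set)
    for ts, acct, dom in tweets:
        bins[(ts.replace(second=0, microsecond=0), dom)].add(acct)
    return {key: len(accts) for key, accts in bins.items() if len(accts) >= threshold}

flags = burst_minutes(tweets)
```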

The InfoWars site was the second-most highly tweeted in our data set (1742 times). Almost all of the tweet activity citing InfoWars came from a coordinated set of accounts — all were similarly named and each sent a single tweet linking to one of two InfoWars articles about different alternative narratives of different shooting events. All of these accounts are now suspended. Though not as sophisticated as TheRealStrategy, this botnet did amplify the content of InfoWars, which was occasionally picked up and retweeted by others.

Political Propaganda: Nationalism vs. Globalism

One of the first things that struck us as we conducted qualitative content analysis on the alternative media domains was the amount of political content on the websites. We attempted to characterize this content, going through several rounds of iteration to try to recognize patterns across the sites and distinguish between different political orientations.

It quickly became clear that the U.S. left (liberal) vs. right (conservative) political spectrum was not appropriate for much of this content. Instead, the major political orientation was towards anti-globalism. Almost always, this orientation was made explicit in the content.

The meaning of globalism varied across the sites. For some websites focused on a U.S. audience, globalism implied a pro-immigrant stance. For more internationally-focused sites, globalism was used to characterize (and criticize) the influence of the U.S. government in other parts of the world. In some of the more conspiracy-focused sites, the term was used to suggest connections to a global conspiracy by rich, powerful people who manipulated the world for their benefit. Globalism was also tied to corporatism — in other words, the ways in which large, multi-national companies exert power over the world. And the term was also connected, implicitly and explicitly, to mainstream media.

In this way, to be anti-globalist could include being anti-mainstream media, anti-immigration, anti-corporation, anti-U.S. government, and anti-European Union. Due to the range of different meanings employed, the sentiment of anti-globalism pulled together individuals (and ideologies) from both the right and the left of the U.S. political spectrum. Disturbingly, much of the anti-globalist content in these alternative media domains was also anti-Semitic — echoing long-lived conspiracy theories about powerful Jewish people controlling world events.

So Many Conspiracy Theories: Crippled Epistemologies, Muddled Thinking, and the Fingerprints of a Disinformation Campaign

Another thing we noticed was both a proliferation and a convergence of different conspiratorial themes. Every domain that hosted an article promoting an alternative narrative of a shooting event also contained content referencing other conspiracy theories — sometimes hundreds of them. They were not all political in nature. We also encountered pseudo-science theories about vaccines, GMOs, and “chemtrails”. Some domains were all about conspiracy theories, but others featured seemingly normal news with conspiracy theories sprinkled in. Through qualitative analysis, we determined 24 alternative media domains to be primarily focused on distributing conspiracy theories and 44 to be primarily focused on communicating a political agenda.

Though there were many different theories spreading through this information ecosystem, we also saw a convergence of themes — some of the same stories appeared on several different domains. Occasionally, the stories seemed largely independent (i.e. different perspectives, different evidence), but often they were essentially copied from one site to another, or a downstream story simply synthesized an article on another site, including lengthy excerpts from the original. Additionally, a few authors seemed to contribute stories to multiple domains in the network.
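One common way to surface this kind of cross-site copying (offered here as an illustration, not necessarily the method we used) is near-duplicate detection with word shingles and Jaccard similarity:

```python
import re

def shingles(text, k=5):
    """The set of k-word shingles of a document, lowercased."""
    words = re.findall(r"[a-z']+", text.lower())
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Jaccard similarity of two shingle sets (1.0 = identical)."""
    return len(a & b) / len(a | b) if a | b else 0.0

# Invented example texts: a story, a lightly edited copy, and an unrelated one.
original = ("Witnesses reported multiple gunmen at the scene, "
            "contradicting the official account of a lone shooter.")
copied = ("BREAKING: Witnesses reported multiple gunmen at the scene, "
          "contradicting the official account of a lone shooter.")
unrelated = "The city council approved a new budget for road repairs today."

similarity = jaccard(shingles(original), shingles(copied))
```

A copied story scores near 1.0 even with minor edits, while independent stories about the same event score near 0.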

So, a person seeking information within this ecosystem might encounter an article from one website that synthesized an article from a second website that was originally posted on and copied from a third website. One effect of this is that people seeking information within this space may think they are getting information from many different sources when in fact they are getting information from the same or very similar sources, laundered through many different websites. Sunstein & Vermeule (2009) argue that conspiratorial thinking is related to a “crippled epistemology” and that a significant component of this is a limited and/or slanted information diet (for example, one shaped by a social group). Our research suggests that the information dynamics of this alternative media ecosystem (the same information existing in different forms in different places) may create a false perception of information diversity or triangulation, further complicating this issue of crippled epistemologies.

From another perspective, these properties of the alternative news ecosystem — the proliferation of many and even conflicting conspiracy theories and the deceptive appearance of source diversity — may reflect the intentional use of disinformation tactics. Though we often think of disinformation as being employed to convince us of a specific ideology, in a 2014 article titled “The Menace of Unreality”, Pomerantsev and Weiss describe how Russian disinformation strategies (which they trace back to Lenin) are designed not to convince but to confuse, to create “muddled thinking” within society. The strategic logic is that a society that learns it cannot trust information can be easily controlled. It is possible that the current media ecosystem — including the alternative media domains and the social media platforms that help spread links to these domains — is contributing to muddled thinking (a relative or effect, perhaps, of a crippled epistemology). It is not yet clear whether these effects are related to purposeful disinformation campaigns or are just emergent effects of our current information space. It seems researchers have some work to do, both to clarify what is happening here and to design systems that are more resilient to disinformation.

Alternative Media Co-opt Critical Thinking, Facts, and Truth

Perhaps the most vexing finding that emerged from this analysis — especially as we attempt to think of how to help people become better consumers of online information — was what we perceived to be an intentional strategy by many alternative media websites to leverage rhetoric around fake news and critical thinking to further confuse and mislead readers.

Our research shows that rejection of mainstream news is a common theme across alternative media domains. Perhaps it’s a truism to say that alternative media exist in juxtaposition to mainstream media, but what is interesting here is that many alternative media sites have explicitly set themselves up in opposition to mainstream, “corporate” media. They have also seized upon claims of political bias in mainstream media (towards liberal or pro-Western ideologies) and have leveraged those claims to support their own legitimacy.

Additionally, it seems they have co-opted arguments about media literacy (boyd makes this same argument) and critical thinking. The conversation around “fake news” often ends with statements about teaching people to become better consumers of information — to be skeptical as they educate themselves through encounters with online media. Alternative news sites have appropriated these arguments and are using them to support the propagation of alternative narratives and other conspiracy theories.

Consider the text below, an excerpt from the About page of the 21stCenturyWire.com domain:

21stCenturyWire.com is a typical domain in our network graph, positioned in the upper left corner (of Figure 1) and strongly connected to both NoDisinfo and VeteransToday (which both spread strong anti-Semitic content). 59 tweets in our collection linked to this domain, referencing multiple articles explicitly supporting alternative narratives about several mass shootings, including claims that both the Dallas police shootings and the Orlando nightclub shootings were staged events. However, the conspiratorial focus of this domain extended far beyond alternative narratives of shootings. Domain content supported a wide range of conspiratorial themes, with articles promoting claims about vaccines causing autism, government-engineered weather events, George Soros-backed anti-Trump protests, and pedophile rings operated by powerful people. Through our analysis of domain content, we also determined 21stCenturyWire to be strongly supportive of Russian political interests (another prominent theme in our data).

The domain is owned and operated by Patrick Henningsen, a journalist who has worked for RT news, Guardian.co.uk, GlobalResearch.ca, and Infowars.com. Perhaps not surprisingly, all of these domains are nodes in our graph.

Examining the About page of 21stCenturyWire, you can see how the site leverages the (somewhat techno-utopian) rhetoric of freedom of information and citizen-journalism — explicitly encouraging readers to use their own “critical thinking” skills while implicitly complimenting them on those skills and perhaps activating a sense of confidence in their abilities. You can handle this. We’ll give you the facts and you can decide for yourself! The site also claims to be outside both corporate and government control. The first claim represents a somewhat natural counter-positioning — i.e. alternative media against corporate-controlled mainstream media. But the second claim is somewhat disingenuous, as the domain often hosts content that is cross-posted to RT — formerly Russia Today, a media outlet funded and largely controlled by the Russian government.

This kind of positioning of alternative media was typical for the domains we examined. Below is another example, this one from the Purpose & Goals page of the NoDisinfo.com domain:

Notice the language emphasizing how this website provides “facts”. It allows people to “make up their own minds”. Its purpose is to unravel “deception and disinformation”. This framing is likely very intentional, claiming to present unadulterated “truth” and empowering users to feel that they are discovering that truth within this domain. And users can find all kinds of truth (in the form of conspiracy theories) here — from 9–11 trutherism to claims about possibly apocalyptic effects of the Fukushima nuclear disaster being purposefully obscured by mainstream media.

Summary and Conclusion

This research attempted to take a systematic approach to unpacking the alternative media ecosystem. We focused on “alternative narratives” of crisis events and utilized Twitter data to map the structure of the alternative media ecosystem that drives these narratives. Through content analysis, we found these domains to collectively host many different types of conspiracy theories — from politically-themed narratives about the “New World Order” to anti-vaccine arguments. In this “virtual” world, the Sandy Hook School shootings were staged by crisis actors and the earth is actually flat after all.

We determined a large portion of the content on this network to be political propaganda. For the most part, this political propaganda was focused around “anti-globalism”. This term was used to designate different things in different domains (and even in different articles within the same domains) — e.g. anti-immigration, anti-Western imperialism, anti-corporation, anti-media. Disturbingly, there were also strong currents of antisemitism (sometimes explicit, sometimes less so) across a subsection of this ecosystem. Taken together, these positions seem aligned with and used in support of the rise of nationalist ideologies in the U.S. and elsewhere.

We also noted how the structure of the alternative media ecosystem and the content that is hosted and spread there suggest the use of intentional disinformation tactics — meant to create “muddled thinking” and a general mistrust in information.

Because the underlying data in this analysis are limited (to tweets about shooting events), future work will be needed to A) assess the broader alternative media ecosystem (our data limited us to a very specific view); and B) determine how influential these media and their messages are on U.S. and global perspectives of world events and science. However, it is clear that information shared within this seemingly fringe information ecosystem is entering the public sphere at large.

When we conducted this analysis in December, many of these alternative news domains were beginning to appropriate the term “fake news” to deflect attacks back onto the mainstream media. Weeks later, newly inaugurated U.S. President Trump echoed this refrain, publicly stating (even tweeting) that various mainstream media outlets and particular stories were “fake news”. Other information trajectories from alternative media websites to public statements by the Trump administration have been identified (e.g. the recent wiretapping claims), and though this does not imply causation, it does indicate a connection between the alternative media ecosystem and the U.S. President. The addition of Steve Bannon to Trump’s inner circle underscores this connection as well. Before his appointment to Trump’s campaign, Bannon ran Breitbart News, an alternative media website that appears in our data — and one that we determined to have a strong anti-globalist perspective. Indeed, Bannon’s recent comments at the Republican CPAC meeting make this ideological orientation explicit.

While criticizing the mainstream media, Bannon said this: “They’re corporatist, globalist media that are adamantly opposed to an economic nationalist agenda like Donald Trump has.”

This comment summarizes a great deal of the research we did, demonstrating how criticism of mainstream media (practically etched into the DNA of alternative media) is aligned with a political agenda of anti-globalism in favor of nationalism, and how that agenda is connected to the political orientations and goals of the Trump administration. Perhaps the main contribution of our research is merely to point out that these ideologies are spread within an alternative media ecosystem that utilizes conspiracy theories like Sandy Hook hoax claims and old anti-Semitic narratives to attract readers and support this spread. And that these alternative media websites aren’t focused solely on U.S. far-right or alt-right content, but are also using alt-left content to pull readers into this information ecosystem and the ideologies spreading there.

Most importantly, this work suggests that Alex Jones is indeed a prophet. Seriously, as I read through dozens of these alternative media websites and dug DEEP into their content, I realized that there is indeed an information war being waged. Three years ago, our lab decided these conspiracy theories were too marginal and salacious to be the focus of our research, almost as if it were beneath our dignity to pay attention to (and risk promoting) this kind of content. What a terrible mistake that was. It seems to me that we weren’t the only ones who made it. It is (past) time we attend to this (as researchers and as designers of the systems that carry this content). I hope it is not too late.

[Here is a list of the domains that appear in our network graph. Please note that the qualitative coding was done through iterative, interpretive content analysis. It is possible that others may perceive that a different determination (or set of categories) would be better for some of these domains. Please let me know if you feel that there is a systematic coding error or unrecognized pattern in the data, as this work is ongoing and I’d love to be able to incorporate your insights. Thank you.]


Associate Professor of Human Centered Design & Engineering at UW. Researcher of crisis informatics and online rumors. Aging athlete. Army brat.