The Bots Boosting QAnon

Jessa Mellea
Foundation for a Human Internet
4 min read · Dec 22, 2020
Graphic by Samantha Weslin

Maybe you heard about QAnon when those alleging voter fraud started throwing around the Dominion conspiracy theory. Or maybe you heard about it during lockdown protests. But what is QAnon? And how has it proliferated across social media so quickly?

What is QAnon?

QAnon is a baseless conspiracy theory claiming that a Satan-worshipping cabal of blood-drinking pedophiles runs the world. In the mythos of the theory, Donald Trump is portrayed as a messianic figure who has come to save the world from the ‘deep state.’ Followers of the conspiracy theory interpret ‘Q drops,’ or posts from Q, a shadowy figure purporting to be a high-ranking US government official. The conspiracy theory has spread to over 70 countries and is particularly popular in Germany, Brazil, and the United Kingdom.

Though the theory centers on the supposed sex-trafficking cabal, it encompasses a variety of other conspiracy theories. Many QAnon followers believe that JFK Jr. is still alive, that the furniture company Wayfair is trafficking children in overpriced cabinets, or that Joe Biden’s recent foot injury is simply a cover for an ankle monitor.

While these theories seem (and are) far-fetched and ludicrous, the online movement has had very real effects in the offline world. Followers of Q have doxxed politicians and activists, launching online harassment campaigns that include death threats and the publication of sensitive information, like home addresses and phone numbers. Supporters have been amplifying the debunked Dominion voter fraud conspiracy, spreading disinformation that undermines democracy and faith in elections.

Q believers have even committed acts of violence. The murder of a mob boss and a legal theorist, several kidnappings, and cases of property damage have been linked to individuals with ties to the conspiracy theory.

Source: AP News

How did QAnon spread so quickly?

While QAnon conspiracies are created and spread by real human beings, bots have played a significant role in amplifying the reach of the conspiracy.

A study of over 240 million tweets related to the recent US presidential election found that two QAnon hashtags (“WWG1WGA” and “qanon”) were among the top 15 hashtags used in tweets by bot accounts. Tweets that included QAnon and other conspiracy theory-related hashtags were more likely to have come from bots compared to other election-related hashtags. An estimated 13% of the analyzed accounts using a conspiracy-related hashtag were run by bots.
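As a rough illustration of the arithmetic behind a figure like that 13%, here is a minimal Python sketch that tallies, for each hashtag, the fraction of accounts using it that were flagged as bots. It is not the study’s actual pipeline; the tweet fields and bot labels are hypothetical stand-ins for the output of a bot-detection classifier.

```python
from collections import defaultdict

# Illustrative sketch only: assumes each tweet has already been
# annotated with a bot/human label by some upstream classifier.
# The field names ("user_id", "is_bot", "hashtags") are hypothetical.

def bot_share_by_hashtag(tweets):
    """For each hashtag, return the fraction of distinct accounts
    flagged as bots among all accounts that used that hashtag."""
    accounts_by_tag = defaultdict(set)
    bot_accounts = set()
    for tweet in tweets:
        if tweet["is_bot"]:
            bot_accounts.add(tweet["user_id"])
        for tag in tweet["hashtags"]:
            accounts_by_tag[tag.lower()].add(tweet["user_id"])
    return {
        tag: len(users & bot_accounts) / len(users)
        for tag, users in accounts_by_tag.items()
    }

tweets = [
    {"user_id": 1, "is_bot": True,  "hashtags": ["qanon", "wwg1wga"]},
    {"user_id": 2, "is_bot": False, "hashtags": ["qanon"]},
    {"user_id": 3, "is_bot": False, "hashtags": ["debate2020"]},
]
print(bot_share_by_hashtag(tweets))
# {'qanon': 0.5, 'wwg1wga': 1.0, 'debate2020': 0.0}
```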

Bots didn’t just amplify QAnon hashtags, though. Bot accounts played a role in spreading news stories by far-right media networks that promote conspiracy narratives, such as OANN or Breitbart. These outlets got a significant boost from automation: over 20% of the accounts that shared their content were run by bots.

Why do these bots matter?

When bots amplify conspiratorial hashtags, they broaden the audience that is susceptible to falling down the rabbit hole of QAnon. Whether a Twitter user is drawn into the conspiracy theory by a bot or by a real account, the very presence of bots increases the chance that a user will stumble across QAnon. Sharing a story from an established website adds a layer of credibility to the bots’ tweets, increasing the likelihood that users will take QAnon’s claims seriously.

The amplification of QAnon on Twitter artificially inflates mainstream perceptions of the scale of the movement, even outside of Twitter. Being seen as a mass movement, rather than a tiny minority, lends the theory legitimacy within the mainstream: 56% of Republicans think that QAnon is mostly or partly true.

So why are these bot armies being deployed to aid QAnon? A recent takedown of over 400 troll accounts operated by the Internet Research Agency, a Russian troll farm implicated in interference in the 2016 election, found that the accounts frequently used #QAnon and related hashtags. This disinformation campaign spread dangerous lies, promoting falsehoods about the dangers of COVID-19 and encouraging distrust in democratic institutions to further Russia’s political interests.

How do we stop bots from amplifying QAnon?

Although deplatforming efforts by Twitter have been somewhat effective in removing the largest QAnon influencers, influencers and bots alike are able to return to the platform fairly quickly. Bot armies are cheap to launch, leaving platforms playing whack-a-mole.

With technology like humanID, platforms can ensure that each user has a single digital account tied to their phone number, while keeping the user anonymous. Because every additional account requires another verified phone number, bot armies become cost-prohibitive, curbing spam and disinformation efforts.
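To show the general technique, here is a minimal Python sketch of phone-verified, anonymous identity. It is an illustration under stated assumptions, not humanID’s actual implementation: the keyed one-way hash, the service secret, and the per-app scoping below are all assumptions about how such a system could work.

```python
import hashlib
import hmac

# Illustrative secret held only by the identity service; in a real
# deployment this would be a securely managed random key.
SERVICE_SECRET = b"long-random-secret-held-by-the-identity-service"

def anonymous_id(phone_number: str, app_id: str) -> str:
    """Map one verified phone number to one opaque ID per app.

    A keyed one-way hash (HMAC-SHA256) cannot feasibly be reversed
    to recover the phone number, and scoping by app_id keeps a
    user's IDs unlinkable across different platforms."""
    message = f"{app_id}:{phone_number}".encode()
    return hmac.new(SERVICE_SECRET, message, hashlib.sha256).hexdigest()

# The same phone always yields the same ID on a given platform,
# so a second signup collides with the first account...
assert anonymous_id("+15551234567", "someforum") == \
       anonymous_id("+15551234567", "someforum")
# ...while IDs differ across platforms, preserving anonymity.
assert anonymous_id("+15551234567", "someforum") != \
       anonymous_id("+15551234567", "otherapp")
```

The key property is that the platform never stores the phone number itself, only the opaque hash, so users stay anonymous while each SIM card can back at most one account.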

However, it’s important to note that the majority of accounts promoting Q are run by real people who believe in the conspiracy theory. QAnon is a symptom of the larger problems of distrust in the media and a climate of uncertainty.

Tech alone won’t be able to deradicalize the thousands of people who believe in QAnon, and it won’t be able to stop the conspiracy theory. But it can keep the theory from rapidly proliferating across the web, stopping radicalization before it begins.

What’s humanID?

humanID is a new anonymous online identity that blocks bots and social media manipulation. If you care about privacy and protecting free speech, consider supporting humanID at www.human-id.org, and follow us on Twitter & LinkedIn.

All opinions and views expressed are those of the author, and do not necessarily reflect the position of humanID.

Jessa Mellea
Foundation for a Human Internet

Brown University 2023 | International Relations and Religious Studies | Research and Marketing @ humanID