Why the debunked COVID-19 conspiracy video “Plandemic” won’t go away

Major platforms are removing the discredited conspiracy video, but niche conspiracy communities and “alt-tech” start-ups are rapidly fueling its regeneration

DFRLab
May 14, 2020


(Source: @zkharazian/DFRLab)

Despite efforts by major platforms to limit its spread, copies of the widely debunked conspiracy video “Plandemic” continued to multiply, circulating largely through niche online conspiracy communities. Once Facebook, Twitter, and YouTube began proactively removing the video, users from communities such as QAnon promoted it, re-hosting it on “alt-tech” platforms that cast themselves as “pro-free speech” alternatives while continuing to share links to those copies on Facebook and Twitter at a rapid rate.

The discredited video featuring Judy Mikovits, a former research scientist with a record of promoting scientific falsehoods and engaging in scientific misconduct, has had widespread appeal across various niche conspiracy communities online. It is a particularly dangerous instance of health misinformation not only because of the claims it makes, but because of the format it takes: a slickly produced interview that attempts to cast Mikovits as a credible figure despite her controversial reputation.

The video’s enduring online presence, in spite of major platforms’ efforts to limit its spread, demonstrates that removing the offending content does not stop the spread of a conspiracy once it has gone viral. In the case of “Plandemic,” the removal of the video appears to have triggered a form of the Streisand Effect, in which attempts to suppress online content paradoxically fuel greater interest in that content and drive users to seek it out.

The discovery that the “Plandemic” video has migrated to various “alt-tech” video-sharing sites, at times in anticipation of future removals, underscores the limits of content moderation on individual platforms. When major platforms remove harmful content, it usually migrates to niche refuges elsewhere on the internet to meet demand.

The role of high-follower QAnon accounts on Twitter

A search on the social media listening tool Meltwater Explore for mentions of the topic between May 4 and May 10, 2020 returned roughly 148,000 tweets, with a peak in volume on May 6.

Mention volume for the Plandemic conspiracy from May 4, 2020 — May 10, 2020. (Source: @zkharazian/DFRLab via Meltwater Explore)

Using the rtweet R package, the DFRLab collected 10,000 tweets that used the hashtag #plandemic for a network analysis. The data was pulled on May 7. As such, it should not be viewed as representative of all conversations on Twitter regarding the “Plandemic” conspiracy, but rather as a snapshot of a portion of the activity at a moment when the topic had just crested its peak. Even this limited dataset revealed trends consistent with the overall traffic flow for the topic obtained through Meltwater Explore. Importantly, many of the accounts identified as key hubs for interactions related to the conspiracy in the sample also appeared among the most retweeted accounts in Meltwater.
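For readers who want to reproduce this collection step, a minimal sketch using rtweet’s search_tweets() function would look roughly like the following. It assumes a Twitter API token is already configured for rtweet, and the variable name plandemic_tweets is illustrative; rerunning the query today would not return the same May 7 snapshot.

```r
# Minimal collection sketch using the rtweet package (API credentials assumed
# to be configured). Pulls up to 10,000 recent tweets with the #plandemic
# hashtag, including retweets.
library(rtweet)

plandemic_tweets <- search_tweets(
  q = "#plandemic",
  n = 10000,
  include_rts = TRUE,
  retryonratelimit = TRUE
)
```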

The data was visualized as an actor network using the vosonSML R package, a suite of tools for social media network analysis, and Gephi, an open-source network visualization tool. After processing, the resulting graph file had 8,910 nodes and 8,826 edges. Each node represents an individual Twitter account, and the connections joining them — the edges — represent interactions between accounts in the form of retweets, quote tweets, replies, or mentions. Nodes were sized by in-degree, a measure of how many “inlinks” an account received from other accounts — in simple terms, how many other accounts retweeted, mentioned, replied to, or quote-tweeted it.
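The actor network itself was produced with vosonSML; the sketch below is an approximation of the same structure built directly with dplyr and igraph, so the in-degree calculation and the export to Gephi are explicit. It assumes the plandemic_tweets data frame from the previous sketch and rtweet 0.7-era column names, both of which are assumptions rather than the exact pipeline used here.

```r
# Approximate reconstruction of the actor network: directed edges run from
# the interacting account to the account it retweeted or replied to.
# Column names (screen_name, retweet_screen_name, reply_to_screen_name)
# follow rtweet 0.7.x output and are an assumption of this sketch.
library(dplyr)
library(igraph)

edges <- bind_rows(
  plandemic_tweets %>%                              # retweet edges
    filter(!is.na(retweet_screen_name)) %>%
    transmute(from = screen_name, to = retweet_screen_name),
  plandemic_tweets %>%                              # reply edges
    filter(!is.na(reply_to_screen_name)) %>%
    transmute(from = screen_name, to = reply_to_screen_name)
)
# (quote-tweet and mention edges would be added in the same way)

g <- graph_from_data_frame(edges, directed = TRUE)

# In-degree: how many inlinks each account received from other accounts;
# used to size nodes in the Gephi visualization.
V(g)$indegree <- degree(g, mode = "in")

# Export for layout and visualization in Gephi.
write_graph(g, "plandemic_actor_network.graphml", format = "graphml")
```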

Conversations involving the conspiracy clustered into “communities,” each arranged around a key discussion hub or influencer account that received high engagement from the other accounts in the cluster. Most of these communities consisted of accounts actively promoting QAnon conspiracies. So while the accounts involved seemed to share a common interest in QAnon, the manner in which they engaged with one another about the conspiracy was highly multipolar: there was no single central discussion hub with which all of the accounts interacted.
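One common way to surface such clusters is modularity-based community detection; the sketch below shows an equivalent step in R, assuming the igraph object g from the previous sketch. The Louvain method used here is the algorithm underlying Gephi’s modularity statistic, not necessarily the exact procedure applied in this analysis.

```r
# Community detection on the actor network (assumes the igraph object `g`
# from the previous sketch). Louvain clustering requires an undirected graph.
library(igraph)

comms <- cluster_louvain(as.undirected(g, mode = "collapse"))
V(g)$community <- membership(comms)

# The highest in-degree account in each cluster is the "discussion hub"
# around which that community is arranged.
hubs <- data.frame(
  account   = V(g)$name,
  community = V(g)$community,
  indegree  = V(g)$indegree
)
head(hubs[order(hubs$community, -hubs$indegree), ], 20)
```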

This may be a feature of QAnon Twitter networks. Previous analysis by researcher Erin Gallagher described QAnon networks as “dense and frenetic,” and noted how these accounts are known to congregate in “Twitter rooms,” where they coordinate retweets of each other’s posts to artificially increase engagement.

A Twitter actor network of the hashtag #Plandemic. (Source: @zkharazian/DFRLab via vosonSML R package, clockwise from top: @gbroh10/archive; @sheepknowmore/archive; @cjtruth/archive; @99freemind/archive; @QTheWakeUp/archive; @PlannedemicT/archive; @neruaelle/archive; @donnawr8/archive)

Even though much of the conversation occurred in relatively isolated community clusters, many of the accounts at the center of each cluster prominently displayed QAnon hashtags, such as #WWG1WGA, and other QAnon references in their bios:

High-follower QAnon accounts were at the center of the community clusters discussing the Plandemic conspiracy on Twitter. (Source: @donnaWR8/archive, top left; @QTheWakeUp/archive, top right; @cjtruth/archive, bottom left; @99freemind/archive, bottom right)

The Meltwater Explore analysis corroborated this finding, identifying some of the same QAnon accounts as authoring the most retweeted posts pushing the conspiracy:

Most retweeted posts promoting the conspiracy on Twitter came from several of the high-follower QAnon Twitter accounts also identified in the DFRLab’s network analysis as central influencers. (Source: @zkharazian/DFRLab via Meltwater Explore)

A few accounts at the heart of their own major community cluster in the sample were devoted to debunking the conspiracy. These included the Twitter account of Renée DiResta (@noUpside), a disinformation researcher with the Stanford Internet Observatory who has published extensively on “anti-vaxx” and “anti-science” conspiracies online; the account of David Gorski (@Gorskon), the editor of Science-Based Medicine, a blog devoted to covering medical scams, controversies, and pseudoscientific claims; and @RachelAlter007, an unverified account largely devoted to emphasizing the safety and efficacy of vaccines. YouTube’s official Twitter account also formed a cluster, largely because it was flooded with interactions, first from users who posted the YouTube-hosted copy of the video to Twitter, and later from those enraged at the company’s decision to remove the video from its platform.

To better understand the topics of these conversations on Twitter, the DFRLab constructed what is known as a bigram word network of commonly co-occurring pairs of words. The visualization below shows the word pairs that appeared alongside each other more than 200 times in the collected tweets containing the #Plandemic hashtag.
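A sketch of that construction, again assuming the plandemic_tweets data frame from the earlier collection step, uses tidytext to tokenize the tweet texts into word pairs and ggraph to draw the network; the 200-occurrence threshold matches the visualization below.

```r
# Bigram word network sketch: tokenize tweet texts into word pairs, drop
# stop words, count co-occurrences, and plot pairs appearing > 200 times.
library(dplyr)
library(tidyr)
library(tidytext)
library(igraph)
library(ggraph)

bigram_counts <- plandemic_tweets %>%
  select(text) %>%
  unnest_tokens(bigram, text, token = "ngrams", n = 2) %>%
  separate(bigram, into = c("word1", "word2"), sep = " ") %>%
  filter(
    !is.na(word1), !is.na(word2),
    !word1 %in% stop_words$word,   # drop stop words such as "and", "or", "I"
    !word2 %in% stop_words$word
  ) %>%
  count(word1, word2, sort = TRUE)

bigram_graph <- bigram_counts %>%
  filter(n > 200) %>%
  graph_from_data_frame()

ggraph(bigram_graph, layout = "fr") +
  geom_edge_link() +
  geom_node_point() +
  geom_node_text(aes(label = name), repel = TRUE)
```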

Prominent word paths in the corpus of tweet texts included phrases such as “youtube censored doctors,” “youtube deleting multiple copies,” and “molecular biologist breaks silence.”

Bigram count word network (stop words such as “and,” “or,” and “I” filtered out) of #Plandemic on Twitter, with common linguistic word paths indicating discussion of “censorship” or “silencing” marked in red. (Source: @zkharazian/DFRLab)

A significant driver of discussion on Twitter was YouTube’s and other major platforms’ decision to remove, and thus “censor,” the video, as shown in both the highlighted paths in the word network above and the inset tweets in the earlier actor network. Users reposted the video, urging others to watch it before it was removed again, and often directed them to external domains hosting the video.

Rapid link sharing of the conspiracy video on Facebook

On Facebook, the DFRLab found dozens of Facebook groups whose members were engaging in sustained, rapid link sharing related to the conspiracy. Using the CooRNet R package and CrowdTangle’s Historical Data feature, the DFRLab gathered a sample of posts referencing “Plandemic” since the conspiracy began to circulate and extracted the URLs those posts shared. Given a set of URLs, CooRNet identifies public Facebook entities, such as pages and groups, that repeatedly share the same links within an unusually short period of time.

What constitutes an “unusually short period” is defined by the “coordination interval,” which CooRNet calculates algorithmically. All of the groups indicated in the network graph below fell well within that interval, with the groups concentrated in the center displaying particularly intense rapid link sharing.
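As a simplified illustration of what that detection involves (not CooRNet’s own code), the sketch below flags pairs of Facebook entities that shared the same URL within a fixed coordination interval. It assumes a CrowdTangle export called shares with one row per share and hypothetical column names account, url, and date; CooRNet estimates the interval from the data rather than taking a fixed value.

```r
# Simplified illustration of rapid link-sharing detection (not CooRNet's
# implementation). `shares` is assumed to be a CrowdTangle export with one
# row per share: the Facebook entity (`account`), the shared URL (`url`),
# and the share timestamp (`date`, as POSIXct).
library(dplyr)

coordination_interval <- 60  # seconds; CooRNet estimates this threshold algorithmically

rapid_pairs <- shares %>%
  inner_join(shares, by = "url", suffix = c("_a", "_b")) %>%
  filter(
    account_a < account_b,  # count each unordered pair of entities once
    abs(as.numeric(difftime(date_a, date_b, units = "secs"))) <= coordination_interval
  ) %>%
  count(account_a, account_b, name = "co_shared_links", sort = TRUE)

# Entities that repeatedly co-share the same URLs within the interval form
# the edges of the network graph shown below.
head(rapid_pairs)
```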

Network graph showing rapid link sharing of the Plandemic conspiracy to Facebook groups. (Source: @zkharazian/DFRLab via CooRNet and CrowdTangle)

The top URLs shared by these entities were all links to the “Plandemic” video featuring Mikovits on Vimeo and YouTube. The videos had all been taken down by the platforms by the time of analysis.

Top URLs shared by the rapid link-sharing groups identified by CooRNet, by number of shares. (Source: @zkharazian/DFRLab via CooRNet and Datawrapper)

CooRNet interprets repeated rapid link sharing as a proxy for coordinated activity. In this case, the DFRLab did not find evidence of consistent coordination on the part of any of the groups or their members. Instead, the analysis revealed that the “Plandemic” conspiracy ripped through these groups at an astonishing rate. The pace and volume of link sharing was, at times, so far beyond the norm that it mimicked coordinated behavior.

Virtually all the groups that exhibited this pattern were niche conspiracy communities, devoted to topics such as QAnon, chemtrails, and the Reopen America protests. The network structure’s density indicated that the same links promoting the conspiracy were posted rapidly across these various communities, suggesting that the conspiracy had widespread appeal. Independent researcher Erin Gallagher has also mapped these conspiracy communities’ role in the spread of “Plandemic” in her recent Facebook network analysis.

A number of posts copied text from “Plandemic” creator Mikki Willis’s original Facebook post, or referenced the fact that Willis knew his work would be removed.

Mikki Willis’s original post was copied and shared across multiple Facebook pages. (Mikki Willis, left; counterbalancetoday/archive, top right; VaXism/archive, middle right; 4timesayear/archive, bottom right)

Hosting on alternative platforms

Off the major platforms, several start-up services positioning themselves as “pro-speech alternatives” hosted the video. One of these was BitChute, a video-sharing site described by its “officially unofficial” Twitter account, which appeared prominently in the earlier Twitter network analysis, as a “Revolutionary p2p video platform that puts respect for the individual and #freespeech first.”

A single copy of the video hosted on BitChute was viewed over 900,000 times and posted to dozens of conspiracy groups across Facebook. An archived search for “plandemic judy mikovits” on BitChute returned 28 results, most of which were copies of the video.
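Facebook share activity for an individual upload like this can be retrieved through CrowdTangle. The hedged sketch below assumes CrowdTangle’s documented /links API endpoint and an API token stored in the CT_TOKEN environment variable; the endpoint parameters and response structure are assumptions based on CrowdTangle’s public documentation, and the BitChute URL is a placeholder rather than the specific upload analyzed here.

```r
# Hedged sketch: retrieve public Facebook posts sharing a given URL via
# CrowdTangle's /links endpoint. Endpoint, parameters, and response
# structure are assumptions based on CrowdTangle's public API docs;
# the BitChute URL below is a placeholder.
library(httr)

resp <- GET(
  "https://api.crowdtangle.com/links",
  query = list(
    link      = "https://www.bitchute.com/video/EXAMPLE/",  # placeholder
    platforms = "facebook",
    count     = 100,
    token     = Sys.getenv("CT_TOKEN")
  )
)

posts <- content(resp, as = "parsed")$result$posts
length(posts)  # number of public posts sharing the link
```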

CrowdTangle analysis of an upload of the “Plandemic” video to BitChute. (Source: @zkharazian/DFRLab via CrowdTangle and BitChute/archive)

BitChute’s Community Guidelines cite the U.N. Universal Declaration of Human Rights, and state:

There are numerous things on the internet that many people will find personally distasteful and they are free to avoid and/or criticize those things. We feel strongly that every person must be allowed to share their experiences to raise awareness about issues that are important to them and potentially society as a whole.

Nonetheless, the site indicates that certain types of harmful content, such as child abuse and terrorist content, “will not be tolerated and will be reported to the proper authorities.”

A reverse image search of keyframes from the video also revealed that one copy had racked up tens of thousands of views on Lbry, a video-sharing platform dedicated to “content freedom,” and another nearly 800,000 views on BANNED.video, an InfoWars offshoot on which the DFRLab has previously reported. The video also appeared on the video-sharing site Rumble with a banner underneath that read: “Yep! The one platform that won’t take down a video because it goes against the narrative! Free Speech still exists guys!” On the video-sharing site Brighteon, a search for the video returned dozens of results, some with subtitles in different languages.

Willis linked to the original “Plandemic Movie” website in his Facebook post. The website initially only gave users the option to download part 1 of the documentary so they could re-upload it to other video-sharing sites, in an attempt to ensure it stayed accessible; it now also offers the option to view the clip directly on the site. The “Plandemic Movie” URL was primarily shared to QAnon Facebook pages, as well as to anti-vaxx, pro-Trump, and alternative medicine pages. Unlike other links mentioning “Plandemic,” the “Plandemic Movie” URL garnered significant traction on Instagram. Primary engagement came from an account claiming to be operated by Judy Mikovits, as well as from QAnon and conspiracy accounts.

The “Plandemic Movie” URL gained significant interaction on Instagram, primarily from Judy Mikovits and QAnon-linked accounts. (Source: drjudymikovits/archive, left; wwg1wga_/archive, right)

Conclusion

An ecosystem of niche conspiracy communities on Facebook and Twitter, coupled with alternative tech platforms casting themselves as “censorship-free” alternatives to the major platforms, sustained the “Plandemic” conspiracy video even as Facebook, Twitter, and YouTube worked to downgrade and remove it.

The entrenched and active presence of conspiracy communities such as QAnon on the mainstream platforms ensured that “Plandemic” continued to circulate in those venues, where it had a higher chance of reaching a mainstream audience, instead of remaining in the more insulated corners of the internet, the domain of the “alt-tech” platforms now hosting dozens of copies of the video.

Zarine Kharazian is Assistant Editor with the Atlantic Council’s Digital Forensic Research Lab (@DFRLab).

Tessa Knight is a Research Assistant, Southern Africa, with the DFRLab and is based in South Africa.

Follow along for more in-depth analysis from our #DigitalSherlocks.
