Who Benefits from Health Misinformation?

This pandemic exposes the erosion of our collective trust in health expertise

Erin McAweeney
Data & Society: Points
4 min read · Mar 30, 2020


[Image: abstract web of purple dots floating in a dark space]

It’s clear who loses from health misinformation. Mothers lose children after seeking advice in Facebook free-birth groups, measles breaks out in anti-vaxx communities, children with autism are poisoned when they are fed bleach marketed as a miracle cure, and people die from misleading hope about a fake COVID-19 cure.

Who profits from the casualties of health mis- and disinformation is less obvious than who loses. It’s a question I’ve been asked a lot, and the answer isn’t as straightforward as, say, foreign interference in an election to usher a favored candidate into office. Campaigns run by foreign actors meddling with election integrity have a linear thread of content that can be traced back to the political power at the center. The sales of doomsday preparation gear and the attention garnered by making bold claims online about the coronavirus’s spread have clear motivations, but the long-term effect of health mis- and disinformation is more insidious. It’s a question both of who benefits now, whether a profiteering televangelist or a growing movement against vaccination, and of who profits later from the gradual erosion of trust in institutions.

Different groups with different motives are exploiting the COVID-19 pandemic in different ways. I’m a senior analyst at Graphika, a social media network analysis firm, where we map the “cyber-social terrain” and the information that flows through it. To date, we’ve found online communities organized around health topics, politics, and social identity pushing misinformation about COVID-19: grifter televangelists, QAnon, MAGA Twitter, anti-vaxxers, conservative and anti-CCP politicians and billionaires, and anti-immigration parties in France and Italy. Each of these groups frames and misrepresents the issue to fit its ideological goals.

Panicked information-seekers are more likely to consume that problematic information, at a faster rate and in higher volume, than to wait in uncertainty.

Bad actors intent on spreading problematic content can easily capitalize on an information ecosystem full of panicked information-seekers during a health crisis. For example, the constant stream of breaking news during the first few weeks of this pandemic created novel keyword searches and search engine data voids waiting to be filled with information. Fact-checked information takes longer to fill those voids than the bioweapon conspiracy theory that was already circulating heavily in Graphika’s maps. And definitive resolutions, like a vaccine, take even longer to produce, far slower than a magical colloidal silver “cure” that, before this crisis, was already being popularized as a cure for cancer. Panicked information-seekers are more likely to consume that problematic information, at a faster rate and in higher volume, than to wait in uncertainty. This is how the online information ecosystem comes to further undermine our collective trust in health experts and institutions.

[Image: A Graphika map that visualizes the networked conversation of coronavirus disinformation on Twitter.]

Eventually, with enough doubt sown in institutions, people will unwittingly do the work of creating misinformation for bad actors. Links to “tracking” tools and maps of the coronavirus were the top four URLs shared in the conspiratorial clusters of Graphika’s February coronavirus map; the BNO News tracker alone was shared 1,965 times. The popularity of home-brewed trackers and maps ties into conspiracies about death tolls, mistrust of established statistics on the virus, and a broader disbelief in the integrity of health data. Consider, for example, the user who tweeted out a personal investigation, built on the weather-tracking site windy.com and satellite imagery, into the now-debunked claim of rising sulfur dioxide levels over Wuhan. Although the investigation was flawed, the tweet thread was shared over 800 times in one day across our coronavirus map, and at least two news outlets covered the story. The investigation was likely inspired by a conspiratorial Epoch Times article published a few days earlier, featuring “exclusive” interviews with Chinese medical practitioners about allegedly secret crematoriums.

A public too fragmented to collectively trust health experts can’t hold an administration accountable for its lies.

A larger destabilization of health and scientific institutions is at work. Belief in one health conspiracy makes a broader rejection of science more likely. It shouldn’t be a surprise, then, that participation in the coronavirus conversation by the anti-vaxxer accounts Graphika tracks increased by 80%. So, hypothetically, when a presidential administration is held increasingly accountable for enabling the world’s worst outbreak of COVID-19, it’s not hard to imagine that same administration deflecting those accusations with misinformation, such as rumors of a Chinese-engineered bioweapon. A public too fragmented to collectively trust health experts can’t hold an administration accountable for its lies. The grifters and snake oil salesmen are profiting now, but the uncertainty sown today paves the way for an oppressive power to take advantage of a fragmented society far more vulnerable to misinformation in the future.

Erin McAweeney is a senior research analyst at Graphika and an affiliate at Data & Society, where she previously worked on the Health & Data research team.
