Facebook’s Response to COVID Misinformation on its Platform Has Been Pathetic
It is Further Evidence that the Social Media Company May Be Too Big for its Own Good
Slowing Vaccination Rates Are Preventing the End of COVID in the US
With normal life resuming in many parts of the country, the path out of this pandemic in the US is clear: reach herd immunity through COVID vaccines. However, many people in the US are refusing to get vaccinated, leading to a slowing vaccination rate across the country.
Rates have declined from an April high of 2.7 million partially and 2.4 million fully vaccinated people per day to a July low of roughly 360k partially and 260k fully vaccinated people per day. This is despite the US being nowhere near the level required for herd immunity, especially given the spread of the more contagious delta variant.
The White House Lashes Out at Facebook and Facebook Retaliates
These slowing vaccination rates, combined with President Biden's focus on getting Americans vaccinated as quickly as possible, likely prompted his comments that Facebook is causing deaths through COVID misinformation on its platform.
Facebook shot back by saying it has saved lives, citing the number of people viewing facts about the pandemic, including where to get a vaccine, on its platform. The company also cited the steps it has taken to reduce misinformation.
This response is completely meaningless. Given Facebook’s penetration in the US (about 70% of US adults are on the platform and 70% use it daily) there will naturally be a large number of people who view correct information about vaccines and COVID. But this says nothing about the amount of misinformation on the platform.
Furthermore, the people viewing this correct information are likely not the vaccine skeptics who are keeping the vaccination rate low. Facebook could do far more to ensure that the individuals who regularly engage with vaccine skepticism see approved content that could sway them. It could work with researchers to develop content that promotes the effectiveness of vaccines in stopping the pandemic, but Facebook has done little of this. Instead, it lashes out at the White House, claiming the problem lies elsewhere.
Even some doctors agree that Facebook is causing deaths through misinformation on its platform, and research suggests that the majority of vaccine misinformation comes from only a handful of accounts, whose spread Facebook could easily curtail.
Facebook’s Inadequate, and Frankly Lazy, Response
Even casual use of Facebook’s platforms makes it clear that the company could do far more to prevent the spread of misinformation.
Facebook has taken some minimal steps. Today, when you search on Facebook for things like “Covid vaccines unsafe”, you see a banner at the top that highlights Facebook’s COVID information center, which provides fact-based information about COVID-19. However, it provides nothing related to vaccines and their efficacy in preventing transmission or reducing the severity of symptoms.
This is quite lazy. Rather than matching the type of information to the query being searched for, Facebook provides a generic response that is irrelevant to what is being searched.
Anyone searching for this would simply look for the information elsewhere, especially since nothing about vaccines is readily surfaced. Facebook missed an opportunity to provide credible, relevant results.
In addition, searching on Instagram for things like “vaccines” and “covid vaccines” returns accounts of vaccine skeptics such as “vaccine_transparency”, “covid_19_vaccine_nt”, and “the_covidvaccine_informer” in the top search results.
Here is an example of what the results for “covid vaccines” look like:
And here is what posts by the search result “covid_19_vaccine_nt” look like. They are clearly questioning the efficacy of the COVID-19 vaccine:
In contrast, searching for “covid vaccines unsafe” produced no results except the WHO Instagram page.
But again, this demonstrates the sheer laziness of Facebook’s response to combating misinformation. How hard would it be for Facebook to curate all search results containing the word “vaccine” so that they only return approved results from scientific organizations? (Not to pick solely on Facebook: similar searches on YouTube produced similarly troublesome results.)
Another social media company has done exactly this.
Searches for these things on Pinterest return a much better response (full disclosure: I currently work at Pinterest, but its handling of misinformation is clearly much better).
All vaccine related posts contain a declaration at the top that highlights how these queries often have misinformation and therefore only return results from trusted organizations. Facebook could easily implement something like this.
Why This Matters
Facebook is such a large platform that it generates what economists call externalities: benefits and costs imposed on third parties that are not priced in by the people who use Facebook, or by Facebook itself, in the free market.
Specifically, in the context of vaccines, people who view misinformation on Facebook and become less likely to get vaccinated do not just affect themselves; they also harm the people around them. Not getting vaccinated raises the likelihood that they transmit the virus, that they are hospitalized and impose costs on others, and that the virus has room to mutate into new, more dangerous variants like delta.
Facebook is not incentivized to correct this failure because it does not directly affect its business. The company could treat the problem as its social responsibility, but the changes it has implemented, as seen above, and its antagonistic response to the White House show that it has abdicated responsibility for this negative externality.
Clearly, Facebook cannot be entrusted with the influence generated by its dominant position. This is another data point that suggests Facebook is too big for its own good.
At its current size, Facebook generates negative externalities that it is not willing to correct, even when they require only the simple fixes that other companies are already implementing. Worse yet, the COVID misinformation saga is just one of many such episodes. If Facebook were broken up, the firm would be easier to regulate and would also face different incentives that might lead it to correct these issues on its own.