Social Media in the Era of COVID-19

By Lorien Abroms and Lailah Fritz

Digital technologies have become part of the fabric of daily life in the United States and in much of the world. In normal times, social media use is widespread. Most U.S. adults use at least one social media platform, and most check in at least daily — if not several times a day. During the stay-at-home orders associated with the COVID-19 pandemic, social media use is at a record high.

Why should we care about this in the era of COVID-19?

We should care because social connections and social support are vital for social and emotional well-being, and during this time of stay-at-home orders, social media is a primary means of connection for many people, especially those who live alone. There is some evidence (from before COVID-19) that social media use can be nurturing, but also evidence that heavy users of social media are more likely to experience depression and other mental and physical health issues. It remains unclear how well connections through social media are working in this period, but they are likely not sufficiently nurturing.

Another reason we should care about time spent on social media is that social media platforms are key sources of information, both of facts and of norms. In a pandemic, people need reliable and trustworthy sources of information. Right now, there is a great deal of misinformation about COVID-19 on social media, whether on Reddit, TikTok, or other platforms. Sensational claims about liquids that can “immunize” against the virus or about “miracle mineral solutions” abound. Videos such as “Plandemic,” which falsely claims that the pandemic stems from the profit-making motives of drug and vaccine companies, have been widely shared on social media, despite efforts by social media companies to stop them. The World Health Organization has described this flood of misinformation on social media as an “infodemic.”

Historically, social media companies refused to influence the health conversations on their platforms, claiming that correcting health or other types of information was not their responsibility. This changed somewhat with the measles outbreak in 2018. Pinterest led the way by prohibiting false health information in its Terms of Use in 2019, and Facebook followed with specific policies for vaccine-related content and now for COVID-19-related misinformation. For COVID-19, Facebook has put up banners directing people to the WHO or CDC and is using human fact-checkers to flag or remove misinformation related to COVID-19, as well as funding fact-checkers associated with the International Fact-Checking Network. These steps are notable and unprecedented, but they may not be enough.

And often, we make sense of health-related information through norms, by looking to our friends and family to see how they are thinking and behaving. We look on social media to see whether our friends are obeying stay-at-home orders: do we see posts of them at home or on a crowded beach? If we belong to groups, we may look at posts in those groups to make sense of the news. There is evidence that some groups or posts, while seemingly organic, may in fact be organized and aimed at promoting political interests. Research has shown, for example, that anti-vaccination comments on Twitter were linked to Russian trolls aiming to create discord in the U.S. In the case of COVID-19, there have been similar reports that social media groups that appear non-partisan and organic, such as those protesting stay-at-home orders, are in fact organized by political groups. This is concerning because the funders of these groups and the intentions behind them are difficult to discern, and they may sway our beliefs and norms against public health goals. At present, there are few, if any, limitations on such groups.

It’s easy to assume that this information and social environment simply exists, for better or worse, and that there is little we, as public health practitioners, can do about it. After all, social media conversations occur in digital spaces governed by private companies. We take issue with this view and argue that the job of public health is to shape the environment so that it is health-promoting, whether that environment is physical or digital, private or public. We can advocate for better standards for the removal of misinformation. We can argue for better policies on how groups establish themselves and what they disclose about themselves. There are many other places for change, including Terms of Use and Community Guidelines; advertising policies; search algorithms; and the provision and content of health banners. Furthermore, as new information policies are introduced on these platforms, we need ongoing surveillance to systematically examine their effects on health outcomes.

Taking these steps, insisting on better information environments and evaluating the current ones, is our public health duty and an important aspect of confronting COVID-19 and future pandemics.

Lorien Abroms, ScD, is a professor of prevention and community health at the Milken Institute School of Public Health at the George Washington University. She is director of the school’s MPH in Public Health Communication & Marketing program.

Lailah Fritz is Lorien Abroms’ daughter and a public health student at the University of Michigan.
