Far-right news sources on Facebook more engaging

Unlike other news across the political spectrum, no “misinformation penalty” for far-right pages

--

by Laura Edelson, Minh-Kha Nguyen, Ian Goldstein, Oana Goga, Tobias Lauinger, and Damon McCoy.

Facebook has become a major way people find news and information in an increasingly politically polarized nation. We analyzed how users interacted with different types of posts promoted as news in the lead-up to and aftermath of the 2020 U.S. elections. We found that politically extreme sources tend to generate more interactions from users. In particular, content from sources rated as far-right by independent news rating services consistently received the highest engagement per follower of any partisan group. Additionally, frequent purveyors of far-right misinformation had on average 65% more engagement per follower than other far-right pages. We found:

  • Sources of news and information rated as far-right generate the highest average number of interactions per follower with their posts, followed by sources from the far-left, and then news sources closer to the center of the political spectrum.
  • Looking at the far-right, misinformation sources far outperform non-misinformation sources. Far-right sources designated as spreaders of misinformation had an average of 426 interactions per thousand followers per week, while non-misinformation sources had an average of 259 weekly interactions per thousand followers.
  • Engagement with posts from far-right and far-left news sources peaked around Election Day and again on January 6, the day of the certification of the electoral count and the U.S. Capitol riot. For posts from all other political leanings of news sources, the increase in engagement was much less intense.
  • Center and left partisan categories incur a misinformation penalty, while far-right sources do not and the penalty for slightly-right sources is negligible. (Note: center sources of misinformation tend to be sites presenting as health news that have no obvious ideological orientation.)

Our findings are limited by the lack of data provided by Facebook, which makes public information about engagement (reactions, shares, and comments) but not impressions (how many people actually saw a piece of content, spent time reading it, and so on). Such information would help researchers better analyze why far-right content is more engaging. Further research is needed to determine to what extent Facebook algorithms feed into this trend, for example, and to conduct analysis across other popular platforms, such as YouTube, Twitter, and TikTok. Without greater transparency and access to data, such research questions are out of reach.

Finally, we rely on NewsGuard and Media Bias Fact Check as two independent resources that rank news sources by bias and reliability; we do not ourselves define which sites do or don’t traffic in dis- and misinformation, nor do we make judgments about political leanings. We also don’t make judgments about what is or isn’t “news”; the Facebook page content we include is from sources that present themselves as news and have ratings from these independent rating organizations.

Methodology

To study our research question — how users interact with news and information on Facebook — we obtained lists of U.S. news sources and their Facebook pages from NewsGuard and Media Bias Fact Check, two independent data providers that survey the news ecosystem and rate the political leaning and quality of media. In total, we were able to identify 2,973 news and information sources with both a political and quality evaluation from either NewsGuard or Media Bias Fact Check, as well as a Facebook page with an average of more than 100 followers.

We then downloaded all public posts between August 10, 2020, and January 11, 2021, from these Facebook and Instagram pages using Facebook’s CrowdTangle tool, a total of 8.6 million posts. From CrowdTangle, we collected data on how many users follow these pages, and how many likes, comments, or other interactions each of these pages’ public posts garnered.
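As a rough sketch of this collection step, the snippet below pages through CrowdTangle’s public /posts endpoint for a saved list of pages. The parameter and response field names follow CrowdTangle’s API documentation, but the token, list ID, and overall structure are illustrative assumptions, not our exact pipeline.

```python
import requests

API_TOKEN = "..."  # assumption: a CrowdTangle dashboard API token
BASE_URL = "https://api.crowdtangle.com/posts"

def fetch_posts(list_id, start="2020-08-10", end="2021-01-11"):
    """Page through all public posts for one saved CrowdTangle list."""
    params = {
        "token": API_TOKEN,
        "listIds": list_id,
        "startDate": start,
        "endDate": end,
        "count": 100,  # maximum page size per request
    }
    url, posts = BASE_URL, []
    while url:
        result = requests.get(url, params=params).json()["result"]
        posts.extend(result["posts"])
        # CrowdTangle returns a ready-made URL for the next page, if any.
        url = result.get("pagination", {}).get("nextPage")
        params = None  # the nextPage URL already embeds the query string
    return posts
```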

Engagement on Facebook and Instagram with far-right news sources was particularly pronounced on Election Day and again on January 6, the day of the U.S. Capitol riot.

Analysis reveals more engagement with extreme content

When we look at interactions per follower of a news page from the perspective of political leaning, we observe that content from sites rated as politically “extreme” tends to generate more interactions. This is consistent with findings from other researchers studying social media in the 2020 elections (Goldstein, The German Marshall Fund; Nikolov et al., Harvard Kennedy School Misinformation Review).

In particular, news sources rated as far-right generate the highest average number of interactions per follower with their posts, followed by news outlets from the far-left, and then publications closer to the center of the political spectrum. To determine this, we added up all the interactions with posts from each political leaning and divided them by the sum of followers the respective pages had that week.
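As a minimal sketch of that weekly aggregation, assuming a pandas DataFrame of posts with hypothetical column names (page_id, leaning, post_date, interactions, page_followers):

```python
import pandas as pd

def engagement_per_leaning(posts: pd.DataFrame) -> pd.Series:
    """Weekly interactions per thousand followers, by political leaning.

    `posts` holds one row per public post; the column names here are
    hypothetical stand-ins for the fields exported from CrowdTangle.
    """
    posts = posts.copy()
    posts["week"] = posts["post_date"].dt.to_period("W")

    # Sum all interactions for each political leaning in each week.
    interactions = posts.groupby(["leaning", "week"])["interactions"].sum()

    # Count each page's followers once per week, then sum per leaning.
    followers = (
        posts.drop_duplicates(["page_id", "week"])
             .groupby(["leaning", "week"])["page_followers"]
             .sum()
    )

    return 1000 * interactions / followers  # interactions per 1k followers
```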

Engagement with far-right and far-left news sources peaked on key dates; spikes were less intense for other news sites

Also worth noting is that engagement with posts from far-right news sources peaked both around Election Day and on January 6, the day of the certification of the electoral count and of the U.S. Capitol riot. For posts from all other news sources, the increase in engagement was much less intense. In the week of January 6, for example, far-right news sources generated just over 500 interactions for every thousand followers of the page; slightly-right or slightly-left news sources reached only around 150 interactions per thousand followers that week.

Far-right news sources suffer no “misinformation penalty”

Given the widely reported disinformation around election fraud in the lead-up to the election and the extremism that surfaced at the Capitol, we were concerned about news sources that spread misinformation and conspiracy theories. Both Media Bias Fact Check and NewsGuard evaluate, among other criteria, whether a news source is a consistent spreader of misinformation or conspiracy theories.

For far-right news sources, misinformation significantly outperforms non-misinformation; for all other political leanings, there is a misinformation penalty resulting in lower engagement per follower.

When we look only at the far-right, we see that misinformation sources significantly outperform non-misinformation sources: Far-right sources designated as spreaders of misinformation had an average of 426 interactions per thousand followers per week, while non-misinformation sources had an average of 259 weekly interactions per thousand followers. Being a consistent spreader of far-right misinformation appears to confer a significant advantage.

All other partisan categories, however, do incur what we call a misinformation penalty, a measurable decline in engagement for news sources that are unreliable, although the penalty for slightly-right sources was very small. Center sources, for example, averaged 79 weekly interactions per thousand followers for non-misinformation pages, compared with an average of 24 for misinformation sources; center misinformation sources thus performed about 70% worse than their non-misinformation counterparts. Here, we note that many of the sources flagged as both ‘center’ and ‘misinformation’ are primarily focused on health topics and therefore do not fit neatly into a “right” and “left” ideological framework.
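For concreteness, the penalty is just the relative drop between those two averages; a tiny illustration with the center-category numbers above hard-coded:

```python
def misinformation_penalty(non_misinfo: float, misinfo: float) -> float:
    """Relative decline in weekly interactions per thousand followers."""
    return 1 - misinfo / non_misinfo

# Center sources: 24 vs. 79 weekly interactions per thousand followers.
print(f"{misinformation_penalty(79, 24):.0%}")  # -> 70%
```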

Directions for future research

These preliminary results raise some interesting questions for future work.

  • How do far-right news publishers manage to engage more with their readers, and why does the misinformation penalty not apply to that category?
  • We have to be careful about what these data do and don’t tell us. For example, we can’t say anything about how many people saw any of the posts we are examining. Facebook’s CrowdTangle provides researchers with information about engagement (reactions, shares, and comments) but not impressions (how many people actually saw a piece of content, spent time reading it, and so on).
  • Further research is needed to determine to what extent Facebook algorithms feed into this trend, for example, and to conduct analysis across other popular platforms, such as YouTube, Twitter, and TikTok. Without greater transparency and access to data, such research questions are out of reach.
  • Lastly, we looked at engagement relative to the pages’ follower base; higher relative engagement for far-right sites doesn’t imply that there’s more far-right content on Facebook, or that Facebook users prefer far-right content.

In conclusion, we found that far-right sources receive considerably more engagement per follower than pages with other political leanings. Furthermore, far-right misinformation sources are the only ones that, in aggregate, engage their followers more than non-misinformation sources of the same partisanship. We’re looking forward to learning more about the news ecosystem on Facebook so that we can start to better understand the whys, instead of just the whats.

More methodological notes

We took news source evaluations from Media Bias Fact Check and NewsGuard. These services provide detailed qualitative information about many aspects of the sources they rate, but we focused on two things: partisanship, and whether a source had a history of spreading misinformation or conspiracy theories. News source evaluations were taken once, at the end of the study period, and are static.

Partisanship

Media Bias Fact Check categorizes the partisanship of media sources into 7 categories: Extreme Right, Far Right, Right, Center, Left, Far Left, Extreme Left. NewsGuard categorizes the partisanship of news and information sources into 5 categories: Far Right, Slightly Right, N/A, Slightly Left, Far Left. We harmonized these categories as follows:

Harmonized ratings for news source content used in our analysis, based on NewsGuard and Media Bias Fact Check ratings.
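The harmonization table itself was published as an image; the mapping below is a hypothetical reconstruction consistent with the five harmonized categories used throughout this analysis (far left, slightly left, center, slightly right, far right), not the exact published table.

```python
# Hypothetical reconstruction of the category harmonization; the published
# mapping appeared as a table image in the original post.
MBFC_TO_HARMONIZED = {
    "Extreme Left": "Far Left",
    "Far Left": "Far Left",
    "Left": "Slightly Left",
    "Center": "Center",
    "Right": "Slightly Right",
    "Far Right": "Far Right",
    "Extreme Right": "Far Right",
}

NEWSGUARD_TO_HARMONIZED = {
    "Far Left": "Far Left",
    "Slightly Left": "Slightly Left",
    "N/A": "Center",
    "Slightly Right": "Slightly Right",
    "Far Right": "Far Right",
}
```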

Quality

Both Media Bias Fact Check and NewsGuard had multiple types of quality evaluations, but for simplicity, we considered only whether the evaluation stated a history of spreading misinformation or conspiracy theories. The two services used different terms to capture the spectrum of misleading or questionable news practices, but both used the terms ‘Conspiracy’ and ‘Fake News’ to represent the far end of that spectrum. If there was a mention of either ‘Conspiracy’ or ‘Fake News’ in the description from either Media Bias Fact Check or NewsGuard, we set our measure, MisinformationOrConspiracy, to true. We’re unable to say whether any one piece of content is or isn’t misinformation; all evaluations are at the source level.
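A minimal sketch of that source-level flag, assuming the two evaluations are available as free-text descriptions:

```python
FLAG_TERMS = ("Conspiracy", "Fake News")

def misinformation_or_conspiracy(mbfc_text: str, newsguard_text: str) -> bool:
    """True if either service's description mentions a flag term."""
    combined = f"{mbfc_text} {newsguard_text}"
    return any(term in combined for term in FLAG_TERMS)
```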

Facebook Posts

We used CrowdTangle to download public posts and interaction statistics for the Facebook pages we were able to identify as being associated with U.S. news sources. We matched Facebook pages to their evaluations either by the Facebook page ID or alias noted in the evaluation, or by matching the verified domain associated with the page to the domain noted in the evaluation. We downloaded public posts from these sources for the period of August 10, 2020, through January 11, 2021. We discarded sources that had fewer than 100 followers.
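A sketch of the matching and filtering logic described above; the page and evaluation field names are hypothetical stand-ins, not CrowdTangle’s exact schema:

```python
def match_page(page, evals_by_page, evals_by_domain):
    """Match a Facebook page to a rating by page ID or alias, falling
    back to the page's verified domain."""
    for key in (page.get("id"), page.get("alias")):
        if key is not None and key in evals_by_page:
            return evals_by_page[key]
    return evals_by_domain.get(page.get("verified_domain"))

def keep_page(page):
    """Apply the follower threshold: discard pages averaging fewer
    than 100 followers over the study period."""
    return page.get("avg_followers", 0) >= 100
```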

Laura Edelson is a PhD candidate in Computer Science at NYU’s Tandon School of Engineering. Laura studies online political communication and develops methods to identify inauthentic content and activity.

Minh-Kha Nguyen is a second-year Ph.D. student at Grenoble Alpes University, working with Oana Goga. His focus is on analyzing news dissemination on social media and investigating its negative impacts.

Ian Goldstein is a first-year Ph.D. student at NYU Tandon, working with Damon McCoy.

Tobias Lauinger is a research assistant professor at the New York University Tandon School of Engineering. His research focuses on transparency and policy enforcement efforts of online social networks, and other aspects of Internet security, privacy, and cybercrime.

Oana Goga is a tenured research scientist at the French National Center for Scientific Research (CNRS) and the Laboratoire d’Informatique Grenoble (LIG). She investigates how social media systems and online advertising can be used to impact humans and society negatively.

Damon McCoy is an assistant professor of Computer Science and Engineering at the New York University Tandon School of Engineering.

--


Cybersecurity for Democracy

Cybersecurity for Democracy is a research-based effort to expose online threats to our social fabric. We are part of the Center for Cybersecurity at NYU.