Does YouTube Radicalize Its Users?

Sharlene McKinnon
5 min read · Feb 6, 2020


Photo by Matthew Henry from Burst

While at ACM FAT* in Barcelona, I attended a series of talks about how audits can be used to better understand the particular outcomes that AI technologies produce.

In the past few years, the internet has seen a rise in the online presence of fringe movements such as the white supremacist alt-right. Observers have proposed that platforms like YouTube serve as a radicalization tool for these movements.

In 2018, Zeynep Tufekci wrote an opinion piece for The New York Times titled “YouTube, the Great Radicalizer,” in which she illustrated how YouTube’s algorithm was pushing users, particularly young men, towards extreme content.

She proposed that YouTube’s “algorithm seems to have concluded that people are drawn to content that is more extreme than what they started with — or to incendiary content in general.”

She points to a former YouTube employee, Guillaume Chaslot, who became disenchanted with the “tactics used to increase the time people spent on the site.” Chaslot later worked with The Wall Street Journal on an investigation that found YouTube often “fed far-right or far-left videos to users who watched relatively mainstream news sources.”

A year later, in The New York Times, Kevin Roose introduced the world to Caleb Cain, a young man who was drawn into the alt-right by far-right YouTube personalities, and chronicled his struggle to find his way out.

Roose proposes that “YouTube has inadvertently created a dangerous on-ramp to extremism by combining two things: a business model that rewards provocative videos with exposure and advertising dollars, and an algorithm that guides users down personalized paths meant to keep them glued to their screens.”

Since the appearance of these two articles, there’s been a fair bit of “Does it? Doesn’t it?” back and forth in the media. But the question is still out there: Does YouTube radicalize its users?

In a paper titled “Auditing radicalization pathways on YouTube,” presented by Manoel Horta Ribeiro with co-researchers from École Polytechnique Fédérale de Lausanne (EPFL) in Switzerland and the Federal University of Minas Gerais in Brazil, the team digs deep into YouTube data to determine whether or not radicalization pathways exist on the platform at scale.

The team looked at the digital traces that people leave on the internet: likes, comments, and views. Using a dataset of 349 channels, 330,925 published videos, and 72 million comments, the researchers divided the channels into four groups (sketched in code after this list):

  • the alt-right, the radical community under study;
  • the alt-lite and the intellectual dark web (IDW), which have been associated with the alt-right and are considered gateway communities by media and NGOs; and
  • media channels, ranging from The New Yorker to Russia Insider, used as a control group for the analysis.
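
To make the study design concrete, here is a minimal Python sketch of how such a channel partition might be represented. The channel IDs are hypothetical placeholders, not the actual channels labeled in the paper:

```python
from enum import Enum

class Community(Enum):
    ALT_RIGHT = "alt-right"  # the radical community under study
    ALT_LITE = "alt-lite"    # gateway community
    IDW = "idw"              # intellectual dark web, gateway community
    MEDIA = "media"          # control group of media channels

# Hypothetical channel-to-community labels; the paper labels 349 real channels.
channel_labels = {
    "channel_a": Community.ALT_RIGHT,
    "channel_b": Community.ALT_LITE,
    "channel_c": Community.IDW,
    "channel_d": Community.MEDIA,  # e.g. a mainstream outlet
}

# The two communities treated as potential gateways to the alt-right.
GATEWAY = {Community.ALT_LITE, Community.IDW}
```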

The team used the data above to analyze the content trajectories taken by users who comment on and are drawn to extreme content.

The hope was to identify points of intersection between the users of the three communities listed above to determine whether or not there is a radicalization pipeline.

Examining the Pipeline

The first question asked was: What similarities exist between the different communities based on the viewed content and comments?

What they found is that similarities exist between the alt-lite and the intellectual dark web users, but not between these two groups and the alt-right.

Users of these two gateway communities follow roughly the same trajectory through YouTube, while alt-right users do not. Between 2016 and 2018, however, the researchers did see the similarities among all three communities grow.
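
One simple way to quantify similarity between communities is the overlap of their commenting users, for example Jaccard similarity. Here is a minimal sketch with made-up user sets; the paper’s exact metric may differ:

```python
def jaccard(a: set, b: set) -> float:
    """Jaccard similarity between two sets of commenting users."""
    return len(a & b) / len(a | b) if (a | b) else 0.0

# Made-up commenter sets per community.
alt_lite_users  = {"u1", "u2", "u3"}
idw_users       = {"u2", "u3", "u4"}
alt_right_users = {"u5"}

print(jaccard(alt_lite_users, idw_users))        # 0.5 -> high overlap
print(jaccard(alt_lite_users, alt_right_users))  # 0.0 -> no overlap
```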

With this information, researchers dug deeper into the data to track individual users who commented only in the gateway communities and, in subsequent years, looked at how they engaged with the alt-right.

From this, they identified three levels of exposure based on the number of comments a user left on alt-right channels:

  • users who commented one to two times were lightly exposed,
  • users who commented three to five times were mildly exposed, and
  • users who commented more than five times were severely exposed.

They started with the set of users who had commented only in the gateway communities (alt-lite, intellectual dark web) and traced those users through subsequent years to measure their level of exposure to the alt-right; the sketch below illustrates the idea.
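
Here is a minimal Python sketch of that classification and cohort tracing. The data layout (comments[user][year][community] holding comment counts) is an assumption for illustration, not the authors’ code:

```python
GATEWAY = {"alt-lite", "idw"}

def exposure_level(alt_right_comments: int) -> str:
    """Map a user's alt-right comment count to the exposure levels above."""
    if alt_right_comments == 0:
        return "none"
    if alt_right_comments <= 2:
        return "light"
    if alt_right_comments <= 5:
        return "mild"
    return "severe"

def gateway_only_cohort(comments: dict, year: int) -> set:
    """Users who, in `year`, commented only in the gateway communities."""
    return {
        user
        for user, by_year in comments.items()
        if by_year.get(year) and all(c in GATEWAY for c in by_year[year])
    }

def trace_exposure(comments: dict, cohort: set, later_year: int) -> dict:
    """Classify each cohort member's alt-right exposure in a later year."""
    return {
        user: exposure_level(comments[user].get(later_year, {}).get("alt-right", 0))
        for user in cohort
    }

# Toy example: u1 starts in the alt-lite and later comments heavily on
# alt-right channels; u2 stays within the gateway communities.
comments = {
    "u1": {2016: {"alt-lite": 4}, 2018: {"alt-right": 7}},
    "u2": {2016: {"idw": 2}, 2018: {"alt-lite": 1}},
}
cohort = gateway_only_cohort(comments, 2016)   # {"u1", "u2"}
print(trace_exposure(comments, cohort, 2018))  # {"u1": "severe", "u2": "none"}
```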

The Findings

The researchers designed a set of user scenarios: they tracked users who initially commented only in the gateway communities and compared their trajectories with those of users who commented only on media channels.

What they found is that exposure to the alt-right was much higher for the people who started in the gateway communities (tallied as in the sketch after these figures):

  • For commenters in the gateway communities: 10% of them were lightly exposed to the alt-right, with 3% being mildly or severely exposed.
  • For those who started with media channels: 4% of them were lightly exposed to the alt-right, and 1% of them mildly or severely exposed.
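
Continuing the earlier sketch, the comparison amounts to tallying exposure levels within each cohort. A small, hypothetical illustration:

```python
from collections import Counter

def exposure_rates(levels: dict) -> dict:
    """Fraction of a cohort at each exposure level
    (levels as returned by trace_exposure in the earlier sketch)."""
    counts = Counter(levels.values())
    total = len(levels) or 1
    return {level: count / total for level, count in counts.items()}

print(exposure_rates({"u1": "severe", "u2": "none"}))
# {'severe': 0.5, 'none': 0.5}
```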

From this, the researchers determined that users who commented in the gateway communities were more likely to go on to comment on alt-right channels.

The next question to answer was: what percentage of the users who ended up in the alt-right went through the gateway-community pipeline?

To find an answer, they took the users who were committed commenters on alt-right channels, at each level of exposure, and examined what percentage of them had gone through a radicalization pipeline starting in either the gateway communities or the media channels.
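
As a rough illustration of this inverse question, again assuming the hypothetical comments layout from the earlier sketches (and showing only the severe level; the paper examines all levels):

```python
GATEWAY = {"alt-lite", "idw"}

def pipeline_share(comments: dict, early_year: int, late_year: int) -> float:
    """Among users with more than five alt-right comments in `late_year`
    (severely exposed), the fraction who had commented only in the
    gateway communities back in `early_year`."""
    committed = {
        user
        for user, by_year in comments.items()
        if by_year.get(late_year, {}).get("alt-right", 0) > 5
    }
    if not committed:
        return 0.0
    via_gateway = {
        user
        for user in committed
        if comments[user].get(early_year)
        and all(c in GATEWAY for c in comments[user][early_year])
    }
    return len(via_gateway) / len(committed)
```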

What they found is:

  • For those who started with media channels: only 5% of users went to the alt-right.
  • For commenters in the gateway communities: 40% of users went to the alt-right.

This finding is important because it shows that YouTube users who comment in the gateway communities are more likely to be exposed to alt-right content and eventually become active community members.

Thus, researchers were able to find evidence that YouTube is serving as a radicalization tool for these movements. Unfortunately, they were unable to determine why this is happening.

Thank you to the authors for sharing their research. You can read the full paper on arxiv.org.
