YouTube’s celebrity culture and community dynamics play a major role in the amplification of far-right content

Image: Unsplash/Oleg Laptev

In recent years, the media has sounded a constant drumbeat about YouTube: Its recommendation algorithm is radicalizing people.

First articulated by Zeynep Tufekci in a short piece for The New York Times, and later corroborated by ex-Google employee Guillaume Chaslot, the theory goes something like this: YouTube, in its wish to keep eyeballs glued to its platform, nudges people toward more and more extreme content over time. For most types of content, this trend can be harmless, but in the case of political content, it can drive people down “algorithmic rabbit holes” to conspiracy theories or white supremacist propaganda.

This…


We don’t have the answers to stop terror attacks being live streamed, but that doesn’t mean we shouldn’t ask questions

An image taken from the live stream of the shooter in Halle yesterday.

In March, there was the Christchurch massacre, when a gunman gleefully referenced popular memes while live streaming himself shooting dozens of Muslim community members. In April, there was the Poway, California synagogue shooting, when a Christchurch copycat attempted to live stream himself in the same manner, failed, but still managed to kill a woman and injure three others.

With the Halle synagogue shooting in Germany yesterday, live-streamed broadcasts of the mass murder of Muslim and Jewish communities have officially become a trend.

I am an academic who studies the far-right online, and after these tragedies, people often ask me…


Becca Lewis is a Ph.D student in communication working at Data & Society. Below she writes about findings from a recent Knight report that explored how misinformation spread during the 2016 presidential election.

If you start paying attention to the issue of online disinformation, you will start to hear a lot about the role of “influence.” Most notably, media outlets have done widespread reporting on Russia’s so-called “influence campaigns,” meant to impact U.S. elections. But “influence” is an important online phenomenon more generally. If you use Instagram, for example, you almost certainly have encountered “brand influencers,” who build devoted audiences…


Media manipulation outside the far-right

In May, Alice Marwick and I released Media Manipulation and Disinformation Online — a report that showed how certain far-right online communities work together to manipulate mainstream media narratives and spread misinformation. These groups are often called the “alt-right” but are more accurately a loose collection of internet trolls, conspiracy theorists, white nationalists, and misogynists. Members of these subcultures work in combination with right-wing media personalities and outlets — and even elected officials like Donald Trump — to amplify misleading or false stories and messages.

Our report focused almost exclusively on far-right groups, and many readers asked us whether anything…

Becca Lewis

I research media manipulation and political digital media at Stanford and Data & Society.
