YouTube Pushes Right-Wing Talking Points. That’s a Problem.

Zhivko Illeieff
Nov 21, 2018

Since YouTube keeps its recommendation algorithm under wraps, it has been impossible for the regular YouTube user to understand how, and if, the platform funnels our collective attention toward certain talking points and perspectives — until now.

Enter AlgoTransparency, a program led by Guillaume Chaslot, an ex-Google engineer who was fired after “agitating for change within the company,” according to The Guardian.

After his departure, Chaslot, who spent time working on YouTube’s recommendation engine and on Google’s display advertising, developed AlgoTransparency as a way to extract data from the video-sharing site and provide a snapshot of its recommended content.

For reference, recommended videos on YouTube show up around the content you are viewing; their purpose is to get users to click and watch additional videos, so they can spend more time on the platform.

Why is understanding YouTube’s recommendation algorithm critical for all of us? Because recommended videos account for 70% of what we watch on YouTube, according to the company’s product chief, Neal Mohan.

“We focused a lot in the last several years on machine learning and artificial intelligence to learn what our users like,” Mohan said. “Our job is to give the user a steady stream, almost a synthetic or personalized channel.”

This means that the content viewed by YouTube’s nearly 2 billion monthly users is inherently connected to artificial intelligence that routinely prioritizes “watch time” and clicks over credible information, as this essay will illustrate.

AlgoTransparency puts into data what I have experienced for some time while using YouTube.

Lately, I’ve noticed a heavy rotation of right-wing content on YouTube’s homepage and recommended videos. Literally every day I’ve had to block and report accounts that regularly posted videos against feminism, safe spaces, socialism, and other “hot topics” on the right and alt-right.

Since I manage a number of YouTube accounts — some of which I rarely use for browsing YouTube — it became clear to me that, while some recommended videos are due to my viewing history, others are simply favored by YouTube’s algorithm.

Blocking and reporting such videos didn’t stop them from appearing, so I finally chose to use a browser extension that blocks YouTube’s recommendations — you can see the difference below.

  • Before using a recommendation-removing extension: my homepage was filled with alt-right, white nationalist, and self-help guru nonsense along with the content to which I am actually subscribed, such as The Real News, Secular Talk, and The Jimmy Dore Show (I must say Tool’s Lateralus is always a good recommendation). In the “Up next” panel, YouTube mixed content that I like with content that I am not subscribed to and have reported/blocked from my feed.
  • After using a recommendation-removing extension: the YouTube homepage experience without the invisible hand of the algorithm, and no “Up next” videos.

Content is King (and no one is talking about it)

YouTube’s algorithm pursues one simple goal: maximizing watch time. If viewers watch videos for longer, it signals to Google that they’re “happier” with the content they’ve found.
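
To make that incentive concrete, here is a minimal sketch of a ranker that optimizes only for expected watch time. It is my own illustration, not YouTube’s actual system; the class, function, and numbers are hypothetical.

```python
# Illustrative only: a ranker whose sole objective is expected watch time.
# Nothing in this objective measures accuracy or credibility.
from dataclasses import dataclass


@dataclass
class Candidate:
    video_id: str
    predicted_watch_minutes: float  # model's estimate of how long the user would watch
    predicted_click_prob: float     # model's estimate that the user clicks at all


def rank_by_expected_watch_time(candidates):
    """Order candidates by P(click) * predicted minutes watched."""
    return sorted(
        candidates,
        key=lambda c: c.predicted_click_prob * c.predicted_watch_minutes,
        reverse=True,
    )


candidates = [
    Candidate("credible_news_clip", predicted_watch_minutes=4.0, predicted_click_prob=0.20),
    Candidate("sensational_conspiracy", predicted_watch_minutes=22.0, predicted_click_prob=0.15),
]
for c in rank_by_expected_watch_time(candidates):
    print(c.video_id)
# The sensational video ranks first (0.15 * 22 > 0.20 * 4), even if it is less credible.
```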

Chaslot describes this as YouTube’s A.I. being “neutral towards clicks” yet partisan toward videos with questionable content but high engagement:

If the belief that “the earth is flat” makes users spend more time on YouTube than “the earth is round”, the recommendation A.I. will be more likely to suggest videos that advocate the former theory than the latter.

Searching for “is the earth flat or round?” and following recommendations five times, we find that more than 90% of recommended videos state that the earth is flat.
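
The method behind that figure can be approximated with a simple crawl: seed with the videos returned by a search, follow each video’s top recommendations a few hops deep, and tally what keeps coming back. Below is a rough sketch of that idea; it is my own illustration, and get_top_recommendations() is a hypothetical stub standing in for whatever scraper or API client one would actually use.

```python
# Sketch of a recommendation crawl: follow "Up next" suggestions several hops
# deep from a set of seed videos and count how often each video is recommended.
from collections import Counter


def get_top_recommendations(video_id, n=3):
    """Hypothetical stub: return the top-n recommended video IDs for `video_id`."""
    raise NotImplementedError("Plug in a scraper or API client of your choice.")


def crawl_recommendations(seed_video_ids, hops=5, per_video=3):
    """Follow recommendations `hops` levels deep and tally how often each video appears."""
    frontier = list(seed_video_ids)
    tally = Counter()
    for _ in range(hops):
        next_frontier = []
        for vid in frontier:
            for rec in get_top_recommendations(vid, n=per_video):
                tally[rec] += 1
                next_frontier.append(rec)
        frontier = next_frontier
    return tally

# Usage: seed with the results for "is the earth flat or round?", then inspect
# crawl_recommendations(seeds).most_common(20) to see where the graph converges.
```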

A look into YouTube’s recommended videos through AlgoTransparency’s website reveals that conspiracy theories, right-wing and pro-corporate propaganda, and aggressive, anti-left sentiments are recommended more than any other kind of content.

While my opinion might appear partisan or a bit harsh, I welcome anyone to provide an alternative description for the sensationalist, anti-feminist, anti-progressive propaganda that appears on YouTube’s most recommended videos:

Videos with aggressive and hyperbolic titles, such as “so-and-so DESTROYS feminist,” “murder porn” documentaries, and videos from Fox News are commonly recommended.

If you think I am being selective, here are some other current recommendations:

The algorithm often features videos from right-wing pundits like Shapiro and Malkin, both paid speakers for the Koch-funded conservative youth organization Young America’s Foundation, as well as from white nationalist Steve Bannon. Compared to the demeaning headlines about progressive viewpoints, YouTube’s recommended news about Trump is neutral to positive.

Due to my own experience with YouTube, I wasn’t surprised when I encountered the lists above. In fact, I saw that many of the same videos I tried to eradicate from my YouTube feed, such as Steve Bannon’s recent “debate,” were actively recommended by the platform.

What is surprising about YouTube’s algorithm is the lack of interest in the topic from those concerned with “Russiagate,” even though the data presented by AlgoTransparency is perhaps the clearest evidence of a massive misinformation campaign during the 2016 U.S. presidential election.

According to Chaslot, data from YouTube’s algorithm during the election showed that “more than 80% of recommended videos were favorable to Trump, whether the initial query was ‘Trump’ or ‘Clinton’,” and that a large proportion of those recommendations “were divisive and fake news.”

“YouTube is the most overlooked story of 2016,” Zeynep Tufekci, a sociologist and technology critic, tweeted in October 2017. “Its search and recommender algorithms are misinformation engines.”

YouTube’s leadership has acknowledged the platform’s issues, but has stopped short of actually pointing out the dangers of the platform’s skewed recommendation system.

“This is the impact of an open platform: it brings the world together in ways that were just not possible before. But we’ve also seen that with openness comes challenges, as some have tried to take advantage of our services,” YouTube CEO Susan Wojcicki said in May of 2018. “There is not a playbook for how open platforms operate at our scale…the way I think about it is that it’s critical that we’re on the right side of history.”

As a result of recent criticisms, YouTube has introduced human-curated videos for their YouTube Kids section, and pledged to hire 10,000 moderators.

We can only hope that being on the right side of history also means that YouTube will soon address the fact that its recommendation algorithm favors conspiracy theories about deadly shootings in the U.S., flat earth theories, aggressive, anti-left sentiments, and fake news about presidential elections, among other problematic content.

It is time to ask why a simple search for something as neutral as the phrase “millennial” brings up content from white nationalist extremists like the “Proud Boys,” right-wing political operatives from Young America’s Foundation (who are often recommended on YouTube), and consumerist propaganda outlets.

It’s time to understand why searching for “liberal” surfaces factual accounts of the history of neoliberalism alongside content from “OverweightF*ck” (clearly a favorite of the Algorithm), Candace Owens (who works for the conservative youth organization Turning Point USA), and Ben Shapiro, another often-recommended Young America’s Foundation-paid pundit.

As one would expect from the examples above, searching for “socialism” isn’t going to surface a lecture by someone like Marxist professor Richard Wolff, but rather acclaimed “intellectuals” like Ben Shapiro, Tucker Carlson, “SJWCentral,” Steven Crowder, “Zane,” Stefan Molyneux, and a Joe Rogan Podcast fan channel.

YouTube’s taking a side — so why aren’t we?

Given YouTube’s tilt toward sensational, long-form content, it is no surprise that right-wing operatives found a way to hijack the attention of YouTube users and push the same fear-based, neoliberal talking points as Fox News and Breitbart.

This is not to say YouTube is purposefully pushing Ben Shapiro and Jordan Peterson to its users. But that is also the problem. How can YouTube be accountable for anything on the platform when executives can blame a supreme algorithm, their profit-based pursuit for engagement, and thousands of part-time content scrubbers every time the platform’s biases are exposed?

As Facebook continues to go down the digital drain following its own lack of interest in protecting its users’ health and data, more people are flocking to YouTube, a video-sharing site, for their news and educational needs.

The fact that content recommended by YouTube advances a very particular set of talking points — opposing feminism, social justice, and left-wing politics, and marginalizing immigrants, people of color, and the LGBTQ community — is a clear sign that something underneath YouTube’s hood either needs to be changed immediately, or is working just as intended.

One recommendation from Data & Society’s “Alternative Influence: Broadcasting the Reactionary Right on YouTube” report is that YouTube “needs to not only assess what channels say in their content, but also who they host and what their guests say.” “In a media environment consisting of networked influencers,” the report states, “YouTube must respond with policies that account for influence and amplification, as well as social networks.”

Failure to rein in YouTube’s influence on its users, and the influence that reactionary networks exert over their shared subscribers, could be catastrophic the next time The Algorithm “decides” to take a side on how it recommends election-related content.

What would happen, for example, if the channels that are currently favored by YouTube’s recommendation algorithm — “Overweightf*ck,” “Lieutenant’sLoft,” “Patriotism Show,” etc. — started supporting Trump more explicitly and smeared his 2020 Democratic opponents as evil socialists and neo-Marxists?

We are practically there, because many of the same actors whose shows and personalities are amplified on YouTube are also part of the networks of tech, media, and news companies that have slowly normalized Trump’s propaganda.

To be clear, while I support the right of those videos and accounts to be on YouTube, I refuse to be subjected to the whims of its “racist uncle” algorithm every time I use the platform. Progressive content creators must absolutely demand that YouTube and its parent company Google become more transparent about their recommendation engine and explain why their platform can be so easily used to spread disinformation.

Here’s what Chaslot and his team recommend to make YouTube’s A.I. more transparent:

What does YouTube recommend on average?
Giving users the option to see a random recommendation made on a given day, as can be done on Twitter and other sites, would enable the public to assess the general alignment of YouTube’s A.I.

Does the A.I. favor specific videos?
To answer this question, three variables are needed: the number of views, the number of views resulting from A.I. recommendations, and the total number of recommendations (the number of views on a video is already public).
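
If those three numbers were published per video, anyone could compute simple statistics like the ones sketched below. The figures are hypothetical and of my own making; today only the total view count is public.

```python
# Hypothetical per-video statistics that Chaslot's three variables would enable.
def recommendation_stats(total_views, views_from_recommendations, times_recommended):
    """Return the share of views supplied by the recommender and how often
    a recommendation actually converted into a view."""
    share_from_algorithm = views_from_recommendations / total_views
    conversion_rate = views_from_recommendations / times_recommended
    return share_from_algorithm, conversion_rate


share, conversion = recommendation_stats(
    total_views=1_000_000,               # public today
    views_from_recommendations=700_000,  # hypothetical
    times_recommended=5_000_000,         # hypothetical
)
print(f"{share:.0%} of views came from recommendations; "
      f"{conversion:.1%} of recommendations led to a view.")
```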

Until there’s more clarity on those issues, I will keep my YouTube experience recommendation- and distraction-free through browser extensions like Nudge, and support platforms that don’t put clicks over common sense.

Update (January 2019): From The Washington Post, “YouTube is changing its algorithms to stop recommending conspiracies”:

YouTube said Friday it is retooling its recommendation algorithm that suggests new videos to users in order to prevent promoting conspiracies and false information, reflecting a growing willingness to quell misinformation on the world’s largest video platform after several public missteps.

In a blog post that YouTube plans to publish Friday, the company said that it was taking a “closer look” at how it can reduce the spread of content that “comes close to — but doesn’t quite cross the line” of violating its rules. YouTube has been criticized for directing users to conspiracies and false content when they begin watching legitimate news.

The change to the company’s so-called recommendation algorithms is the result of a six-month long technical effort. It will be small at first — YouTube said it would apply to less than one percent of the content of the site — and only affects English-language videos, meaning that much unwanted content will still slip through the cracks.

Zhivko Illeieff

Writer & media maker. Essays on culture, tech, media, politics, and propaganda. Bulgarian American. Follow my work at themeltage.com.