Should the YouTube Algorithm Be Made ‘Worse’?

Radicalisation begins in echo chambers, and YouTube’s algorithm is a great architect.

Stuart Mills
Jul 23, 2019

YouTube’s role in radicalising users has received much attention with the recent publication of the New York Times piece, ‘The Making of a YouTube Radical.’ Radicalisation, particularly on the far-right, has been a hot topic for a while, but only recently has the role of major online platforms like YouTube been discussed.

How does YouTube radicalise people? For YouTube content creator Natalie Wynn, whose channel ContraPoints is often described as a project to de-radicalise the Alt-Right, the issue comes largely from the algorithm. YouTube’s video recommendation algorithm operates much like any other recommender system: it processes user data and ranks candidate videos for recommendation.
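To make this mechanical character concrete, here is a toy sketch of how such a recommender might work, assuming each video carries simple topic tags and each user has a watch history. The structure and names are illustrative assumptions; YouTube’s real system is far more sophisticated and its internals are not public.

```python
from collections import Counter

def rank_candidates(watch_history, candidates, top_n=5):
    """Rank candidate videos by overlap with the topics a user already watches."""
    # Tally the topic tags across everything the user has watched.
    topic_counts = Counter(tag for video in watch_history for tag in video["tags"])

    # Score each candidate by how strongly its tags match that tally.
    def score(video):
        return sum(topic_counts[tag] for tag in video["tags"])

    # The highest-scoring (most congruent) candidates are recommended.
    return sorted(candidates, key=score, reverse=True)[:top_n]
```

Note that nothing in this sketch understands what a tag like ‘heritage’ or ‘identity’ actually means; each is just a string to be counted and matched.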

Wynn argues this isn’t enough. In a recent discussion with The Hill, she stated:

“The problem with having computers moderate things is computers are not very sophisticated.”

Sophistication, Wynn continues, takes the form of social intelligence, which the algorithm lacks:

“The people who are pushing far-right, white nationalist viewpoints don’t use swastikas, they don’t say the word ‘white pride’. They say ‘heritage’, they say ‘migrants’, they say ‘identity’, and the people who are saying swastikas and talking about Hitler — those are generally the anti-fascist people.”

This raises the question of how the algorithm can be fixed. YouTube already employs thousands of human reviewers to deal with grievances arising from the algorithm and ensure content is suitable for advertising brackets. But with 400 hours of video uploaded every minute, and over 1 billion users since 2013, it isn’t feasible for the platform to have all content checked for radicalism — let alone recommend content to users!

Automation using the algorithm is the only way to check all this content, and to recommend videos that users might want to watch. Given Ms. Wynn’s comments, then, the solution may be to make the algorithm more sophisticated. But this, I think, is a mistake, because the algorithm is already extremely effective at what it does — it just happens to also push radicalising content.

The YouTube statistics given above tell another story: YouTube is extremely good at attracting users and keeping them watching. Yes, the algorithm doesn’t have the social intelligence to identify right-wing dog whistles, but it is very good at recommending content tailored to each unique user.

This is because the YouTube algorithm is a hypernudge: the combination of behavioural economics and cognitive psychology with big data and computer science. With user-specific data, the algorithm constructs a personalised recommendations tab that has seen YouTube become the biggest video platform on the internet and a challenger to traditional media, constantly nudging users to watch more.

Personalisation — the hallmark of a hypernudge — is often touted as the reason behind ‘echo chambers’ and ‘filter bubbles’ online. These environments breed radicalisation, but they are also extremely attractive from a behavioural standpoint, being familiar to users and providing attractive content at a relatively low cost. For YouTube, where the goal is to get people watching and to keep them watching, personalisation, nudging and big data are all vital components.

Embedding social intelligence into the algorithm will be difficult — extremely difficult. Advances in technology — particularly in AI — might offer some hope that we can have both high personalisation and social intelligence. But social intelligence only really works as a filtering system, one which tenacious characters will always try to outsmart, as we see with YouTubers gaming the algorithm today.

It is personalised recommendations which produce the echo chambers that lead to radicalisation. Tackling personalisation, then, may be more effective than attempting to filter out the content, because the filter only strangles bubbles. De-personalisation pops them. Furthermore, de-personalisation is something that could be done right now: it is easier to de-personalise something than to personalise it.

De-personalisation (i.e. making the YouTube algorithm worse) need not remove all personalisation. If a user has watched a content creator’s video, it seems reasonable to recommend another video by that creator. However, it might also be reasonable to recommend a song after watching a political video, or a video entitled “Why X is right” after watching “Why X is wrong.” This would break up the congruence of message; it would pop the bubble.
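As a minimal sketch of the idea, and assuming the toy ranker above, de-personalisation could be as simple as swapping a fraction of the congruent recommendations for videos drawn from outside the user’s bubble. The `swap_rate` parameter and the `depersonalise` function are hypothetical illustrations, not anything YouTube actually exposes.

```python
import random

def depersonalise(ranked, diverse_pool, swap_rate=0.3, seed=None):
    """Swap some personalised picks for non-congruent videos.

    `ranked` is a personalised recommendation list; `diverse_pool`
    holds videos from unrelated topics or opposing viewpoints.
    """
    rng = random.Random(seed)
    mixed = []
    for video in ranked:
        if diverse_pool and rng.random() < swap_rate:
            # Break the congruence of message: pull a recommendation
            # from outside the bubble instead of the congruent pick.
            mixed.append(diverse_pool.pop(rng.randrange(len(diverse_pool))))
        else:
            mixed.append(video)
    return mixed
```

Tuning `swap_rate` would let a platform keep most of the personalisation that drives watch time while still breaking up the congruence of message.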

In the language of behavioural economics, YouTube wouldn’t be nudging users towards content, but sludging them towards it, slowing down the spiral into radicalisation and opening escape points along the way.

De-personalising the algorithm has a political benefit too: it doesn’t start with censorship. No-platforming has been shown to work, but this will always be a controversial topic. I broadly support no-platforming, but such controversy will (whether justified or not) cloud a corporation’s decision (see the ongoing Carlos Maza situation, for example).

If we play Devil’s advocate and grant this difficulty, de-personalisation is a worthwhile alternative.

No-platforming is effective because it makes it harder for people to access a person’s content. If someone is banned from Twitter, for example, following them to a new social media service takes a lot more effort. De-personalisation achieves the same thing: users have to seek out new videos rather than having YouTube nudge them along.

Politically, then, there are two advantages to de-personalising over a filtering approach. Firstly, YouTube doesn’t become the arbiter of what is and is not socially acceptable. Secondly, radical content producers cannot claim they are being censored (after all, free speech does not entitle one to an audience).

De-personalising the YouTube algorithm is something that can be achieved today; it’s an approach which can help pop the filter bubble; and it can be done without generating a political furore. The only thing that’s stopping YouTube is the price of change.

This, more than anything, explains the state of YouTube today. In the absence of regulation that appreciates how algorithm-driven services actually work, there is little reason for YouTube to change. After all, if people are becoming radicalised, they are also watching a lot of content, and seeing a lot of ads.

In YouTube’s relatively short life, it has only been moments of economic and reputational damage (for example, the Adpocalypse or, again, the ongoing Carlos Maza situation) that have prompted change. This change has frequently been detrimental to content creators too, which is why many YouTubers are naturally reluctant to see further adjustment of the algorithm. Getting this important stakeholder group on board is another challenge which YouTube may be reluctant to embrace.

But de-personalisation is a legitimate way forward, and if done right, can create a more vibrant YouTube experience than the gamified, siloed platform that currently exists.

And, most importantly, it can de-radicalise YouTube.
