Trying to understand YouTube’s recommendation system

We can see a sharp decline in recommendations for ‘alternative influence’ channels.

A bad month for YouTube

“recommendations of borderline content and content that could misinform users in harmful ways — such as videos promoting a phony miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events like 9/11.” — YouTube, announcing changes to its recommendation system in January 2019

“Social media platforms not only host this troubling content, they end up recommending it to the people most vulnerable to it.” — Ysabel Gerrard and Tarleton Gillespie

YouTube under pressure from users and advertisers

A search for ‘feminism’ returns a video of Emma Watson accompanied by an Oral-B ad, with an ‘Up next’ recommendation for an anti-feminist video from an alt-right channel.

The problem of personalization

With the dawn of social media, information — and misinformation — about vaccines can spread further and faster than ever before, and one of the findings of this report is that this may, unfortunately, be advantageous for anti-vaccination groups.

In absolute terms, YouTube rarely recommends videos from anti-vax channels to anonymous users.
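
To make “in absolute terms” concrete, a count like this can be computed from any log of ‘Up next’ recommendations collected in anonymous (logged-out) sessions. The snippet below is only a minimal sketch: the `recommendations.csv` file, its column name, and the list of anti-vax channel IDs are hypothetical stand-ins for whatever data collection was actually used.

```python
# Minimal sketch: tally how often anonymous-session recommendations point
# at channels flagged as anti-vax. The input file, its column name, and
# the channel IDs below are hypothetical placeholders.
import csv
from collections import Counter

# Hypothetical set of channel IDs previously labelled as anti-vax.
ANTI_VAX_CHANNELS = {"UC_example_antivax_1", "UC_example_antivax_2"}

def recommendation_counts(path="recommendations.csv"):
    """Count anti-vax vs. other recommendations.

    Expects one row per 'Up next' recommendation observed in an anonymous
    session, with a 'recommended_channel_id' column.
    """
    counts = Counter()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            channel = row["recommended_channel_id"]
            counts["anti_vax" if channel in ANTI_VAX_CHANNELS else "other"] += 1
    return counts

if __name__ == "__main__":
    counts = recommendation_counts()
    total = sum(counts.values())
    if total:
        share = counts["anti_vax"] / total
        print(f"{counts['anti_vax']} of {total} recommendations ({share:.2%}) "
              "came from anti-vax channels")
```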

“The problem is, YouTube’s recommendation algorithm has been trained over the years to give users more of what it thinks they want. So if a user happens to watch a lot of far-right conspiracy theories, the algorithm is likely to lead them down a dark path to even more of them.” — Issie Lapowsky.
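
Lapowsky’s point can be illustrated with a toy simulation. Nothing below reflects YouTube’s actual model; it is a minimal sketch of a recommender that simply scores videos by how similar their topic is to the user’s watch history, which is already enough to produce the narrowing effect she describes.

```python
# Toy illustration of a "more of what you watched" feedback loop.
# This is NOT YouTube's algorithm: just a made-up catalogue of topic-tagged
# videos and a recommender that favours topics already in the watch history.
import random

CATALOGUE = (
    [{"id": f"news_{i}", "topic": "news"} for i in range(50)]
    + [{"id": f"conspiracy_{i}", "topic": "conspiracy"} for i in range(50)]
)

def recommend(history, k=5):
    """Rank unwatched videos by how often their topic appears in the history."""
    topic_counts = {"news": 1, "conspiracy": 1}  # small prior so no score is zero
    for video in history:
        topic_counts[video["topic"]] += 1
    watched = {v["id"] for v in history}
    unwatched = [v for v in CATALOGUE if v["id"] not in watched]
    return sorted(unwatched, key=lambda v: topic_counts[v["topic"]], reverse=True)[:k]

def simulate(seed_topic, steps=10):
    """A user watches one seed video, then always clicks the top recommendation."""
    random.seed(0)
    history = [random.choice([v for v in CATALOGUE if v["topic"] == seed_topic])]
    for _ in range(steps):
        history.append(recommend(history)[0])
    return [v["topic"] for v in history]

if __name__ == "__main__":
    # One conspiracy video is enough for the feed to lock onto that topic.
    print(simulate("conspiracy"))
    print(simulate("news"))
```

Because the scoring function only reinforces whatever is already in the history, a single watch is enough to tilt every subsequent recommendation toward that topic, which is the feedback loop the quote describes.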
