When YouTube took down my video

Marietje Schaake
Feb 13, 2019


7 October 2016

There was reason to celebrate last week: the European Parliament adopted my report to update the EU’s export control laws for goods that are used for torture and the death penalty. I uploaded a series of videos of the debate in the Parliament to YouTube. I do what I can to make my work easily accessible. Everyone who has ever tried to search for a video of a specific intervention on the European Parliament website knows that it is not the most user-friendly site on the web.

YouTube is now one of the most used platforms for videos online. It has over a billion users, and reaches more 18–49 year-olds than any cable network in the U.S. It has become a portal for sharing information and ideas, which is vital for any democratic society.

Three hours later I found a notice in my inbox, which explained that one of the videos in this playlist had been taken down.

The removed video included footage of European Trade Commissioner Cecilia Malmström, who gave her opinion about the new law. There is hardly a clearer example of political speech than this video.

I was mystified by the notice. How could this video violate YouTube’s community guidelines? Was it the mere mention of the word ‘torture’?

I simply received a link to the very broad YouTube Community Guidelines.

Did the video contain harmful or dangerous content? Violent content? Nudity? Hateful content? Threats? Spam? You can judge for yourself. Yet the notice I received stated that the video was ‘flagged for review’, and that ‘upon review’, YouTube ‘determined that it violated our guidelines’. This formulation suggests that a person rather than an algorithm actually watched the video and still found a violation of the guidelines.

I am still waiting for details from Google about what actually happened. They did reach out to me promising to get to the bottom of the matter. I am pretty sure most users do not get a call from the company after they tweet about a takedown.

Whether the video was removed automatically or somebody actually watched it and decided it violated YouTube’s community guidelines, something is wrong with YouTube’s approach to content moderation. Imagine the impact such broad criteria have on content filtering worldwide.

I objected to the removal. Without knowing why the post was removed, I still had to argue (in one sentence!) why the video needed to stay up, which is slightly Kafkaesque. Four hours later the video was back online.

At noon on the next day, Google apologized publicly.

As a member of the European Parliament with a significant online following I am a public figure. Less privileged users would likely not receive the same treatment, and would find it harder to find a point of contact at Google — or any large platform for that matter — to whom they can speak. I would like to hear your experiences!

On a fundamental level this takedown is just one example of a trend towards the automated removal of illegal or undesired content by large platforms, which essentially happens in a due process vacuum. It is time to change this.

I had already planned a hearing on ‘Algorithmic accountability and transparency in the digital economy’ at the European Parliament on the 7th of November. This week’s experience was a clearer reminder than I could have imagined of why we need to address the role of algorithms, free speech, and access to information. This discussion fits into my broader agenda of giving meaning to the rule of law online. You can register here to attend the seminar.

EDIT: On the 10th of November I received the following response:

--


Marietje Schaake

International Director of Policy, Stanford’s Cyber Policy Center; International Policy Fellow, Institute for Human-Centered AI; President, CyberPeace Institute