Francesca Duchi
Apr 16, 2019 · 4 min read

Problematic Algorithms: YouTube's censorship and demonetization problem

On YouTube, advertising revenue is driven by YouTubers who produce digital video content capable of attracting millions of followers (Langley and Leyshon, 2017, p. 13).

Since 2012, YouTube content can be flagged as "inappropriate" and consequently demonetized, meaning that creators cannot profit from the advertising revenue on those videos. The platform did not inform creators of demonetization until 2016. As the practice came to light, it sparked a conversation about YouTube's definition of "inappropriate", censorship, and the rights of content-makers on the platform.

This demonetization occurs because YouTube requires monetized content to be "advertiser-friendly". YouTube provides the following guidelines for "advertiser-friendly" content:

[Image: YouTube's guidelines for "advertiser-friendly" content]

Most of these guidelines are vague. It is hard to make clear-cut decisions based on descriptors such as "inappropriate language", or the even less precise "controversial or sensitive subjects". Indeed, it is arguably impossible to make completely bias-free judgements when applying such terminology.

YouTube employs algorithms to identify videos that fit these descriptions. However, algorithms can be inaccurate, and when that inaccuracy is paired with the broadness of the guidelines, the result is very likely to be unfair and controversial demonetization.

While many creators were affected, a particularly salient example is that of LGBTQIA YouTubers. Coming-out stories and informative videos on non-heterosexual relationships were identified by the algorithm as not advertiser-friendly. On 14 September 2017, author and comedian Gaby Dunn shared on Twitter that all LGBTQIA content on her channel had been demonetized.

Moreover, she noticed that videos displaying "heterosexual content" were not flagged as inappropriate for advertising.

Clearly, the flagging algorithm embodies a heteronormative bias, treating heterosexual content as more appropriate and therefore more acceptable to advertisers.

YouTube addressed the issue in posts published in April and May 2017. The company admitted that its algorithms make mistakes "in understanding context and nuances", and that they "should not filter out content belonging to individuals or groups based on certain attributes like gender, gender identity, race, religion or sexual orientation". It also promised to fix an engineering "issue", effectively blaming the algorithm itself. By selectively demonetizing some videos, YouTube makes it clear that the platform does not treat all content and creators equally, and that its loyalty lies with the advertisers, who constitute its main source of revenue.

Demonetized videos are not removed, so they are not censored entirely from the website. However, creators who receive no revenue from their videos are strongly discouraged from making content, and at times forced to abandon topics at risk of being flagged as inappropriate. Any creator discussing a subject that a big advertiser might find "dangerous" or "inappropriate" is actively deterred from doing so. In this sense, YouTube is effectively censoring content.

Besides highlighting how bias and discrimination can be built into algorithms, this issue also raises questions about digital labour and the role of YouTube content-makers as labourers. The online activity of such individuals sits on the blurred boundary between "labour" and "play" (Kücklich, 2005): many YouTubers began their channels as a hobby or distraction, yet making YouTube videos has become a full-time profession for some, and a great source of revenue for YouTube. Through demonetization, YouTube censors YouTubers' work after it has gained the audience's approval and contributed to YouTube's revenue, effectively pushing creators away from certain topics. This creates a complex picture of the labour and power relationships between creator, platform, advertisers and audience.

Finally, the demonetization of select videos further disproves the idea that the Internet is an entirely democratic place that disrupts existing top-down power relations in favour of a horizontal model (Pasquale, 2010, p. 312). Notably, the audience has virtually no power in determining which videos are demonetized. While a significant audience is needed for a video to become monetized in the first place, it is the advertising companies, and YouTube acting on their behalf, that decide which videos remain a source of revenue.

Kücklich, J. (2005). Precarious Playbour: Modders and the Digital Games Industry. The Fibreculture Journal, 5(25).

Langley, P., & Leyshon, A. (2017). Platform capitalism: the intermediation and capitalisation of digital economic circulation. Finance and Society, 3(1), 11–31.

Pasquale, F. (2010). Two Narratives of Platform Capitalism. Yale Law & Policy Review, 35(1), 309–319.

Future Vision

A publication centered around high quality storytelling
