COPPA, YouTube advertising and the future of online content creation, Part 1

Why the COPPA situation tells us we need to radically rethink how we consume and support our favourite online creators

Alex "Scrapper"
The Mechworks
2 min read · Nov 24, 2019


What’s going on?

The Children’s Online Privacy Protection Act (COPPA) is a US law that came into effect in 2000, intended to protect children when they access the internet. The reason this nearly two-decade-old law is being discussed right now is recent concerns raised by the US Federal Trade Commission (FTC) about YouTube’s compliance with it. The issue is that content targeted at children cannot have personalised ads run against it, because serving those ads requires YouTube to track children’s activity online and build a profile based on their interests. This is a major problem, as personalised ads are the most effective and profitable form of advertising. The change affects both creators of child-friendly content, who will lose a significant percentage of their revenue, and creators whose content appeals to children but may not be child friendly (particularly gaming content), who may be unfairly labelled as child-directed or fined for labelling their content incorrectly.

For an explainer on the changes coming to YouTube in response to this, check out the YouTube Help article here.

I see this current issue as part of a wider problem with how people consume content on YouTube. In short, creators’ dependency on advertising revenue has led to a situation where children using the platform were profiled so they could be served targeted advertising to make YouTube money.

The targeted advertising model can also be seen as the root cause of other issues on YouTube’s platform.

The personalised interest profiles that YouTube automatically creates for each user are also used to serve them personalised recommendations for new videos to watch. The problem is that unscrupulous creators can make videos that mimic popular content in order to get people to watch things they never intended to. Once a person has accessed harmful content this way, the recommendation algorithm treats it as one of their interests and continues to show them similar content, creating a snowball effect. This has been shown to happen with multiple types of potentially dangerous content, including far-right propaganda and videos featuring violent and sexual content targeted at children.
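To make that feedback loop concrete, here is a minimal toy sketch of how such a snowball can emerge. This is not YouTube’s actual system; the topics, starting weights and update rule are all invented for illustration.

```python
import random

def recommend(profile):
    """Pick a topic with probability proportional to its current weight."""
    topics = list(profile)
    weights = [profile[t] for t in topics]
    return random.choices(topics, weights=weights)[0]

def simulate(steps=50, seed=1):
    random.seed(seed)
    # Hypothetical starting profile: a viewer who mostly watches
    # gaming videos, with one stray click on borderline content.
    profile = {"gaming": 10, "music": 5, "borderline": 1}
    for _ in range(steps):
        topic = recommend(profile)
        # Watching a recommendation feeds straight back into the
        # profile, making that topic more likely to be served again.
        profile[topic] += 1
    return profile

print(simulate())
```

Whichever topics the loop happens to pick early on become more likely to be picked again, so a single stray click can compound into a dominant “interest”: the snowball effect described above.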

This is not a bug in the YouTube system; it is a feature. The need for effective advertising creates the need to personalise the ads people receive, which leads to this kind of data collection, which in turn is the source of the issues detailed above.
