Published in Data Protection
Instagram for Under-13s: Facebook’s “Fight” With Child Privacy Law?

Facebook is planning a social media network for young kids. Will it succeed? Probably.

Photo by Prateek Katyal on Unsplash

Mark Zuckerberg once said that the Children’s Online Privacy Protection Act (COPPA) was a “fight” Facebook would “take on at some point.”

This week, we learned Facebook is planning an Instagram for under-13s. If this is the fight, Facebook will probably win.

Why? Why would Facebook do this?

Instagram is currently unavailable for under-13s because they have special legal protections, meaning that it’s harder for businesses to collect their data and to target them with ads.

In the U.S., the main children’s privacy law is COPPA, a federal law passed in 1998, before Facebook, YouTube, or Instagram existed.

Since a 2013 rule update, COPPA has required websites to get parental consent before tracking under-13s with persistent identifiers such as cookies.

Does COPPA have teeth?

Some COPPA settlements seem big at first — like the Federal Trade Commission’s $170 million settlement with Google in 2019. But bear in mind that Google’s turnover was $160 billion that year.

Google allegedly tracked under-13 users on YouTube. After the settlement, YouTube essentially shifted COPPA liability to content creators (among other measures).

TikTok (previously Musical.ly) also settled with the FTC for $5.7 million under COPPA in 2019.

Whether through carelessness or by design, apps and sites repeatedly allow kids to sign up without parental consent.

What about outside the U.S.?

Outside the U.S., things could be even more complicated for Facebook’s new venture.

Last year, I covered two ongoing U.K. cases alleging that platforms had violated the GDPR’s child privacy rules.

The first, against YouTube, seeks an ambitious $2.5 billion in damages over how YouTube processes kids’ data to make recommendations. The second, still at a very early stage, is against TikTok and is led by a 12-year-old girl.

Recently, we’ve seen some EU data protection authorities (DPAs) coming down hard on social media apps under the GDPR’s child privacy rules. See Italy’s recent action against TikTok, for example.

How will Facebook avoid problems like this?

Setting up a separate platform for youngsters might be a way for Facebook to avoid claims under child privacy law.

For example, Tiny Instagram could require parental consent at sign-up. Or it might avoid collecting certain types of data. Or it could — imagine this — avoid targeting ads based on users’ personal information.

Or, Facebook could simply take the risk, pay any resulting fines, and find a way to comply with enforcement notices while still turning a profit.

Cases like those explored above could be a headache for Facebook. But I think it’ll be fine — it always is.


