Zoom Asked to Screen Share Compliance with Children’s Privacy Laws

Sara Hobe
Published in The Startup
4 min read · Apr 3, 2020

Spotlight on the Company’s Privacy Practices Just Reveals Larger Issues for Children’s Privacy Online

Earlier in March, schools around the U.S. began shutting down in response to the Coronavirus pandemic. Shelter-in-place mandates left many educators scrambling for ways to keep their students on track in the absence of in-person instruction. Many turned to free video conferencing software, and Zoom CEO Eric Yuan stepped up to the plate, setting schools up with free accounts. Zoom’s functionality allowed teachers to hold classes remotely, with no pesky 40-minute time limit for teachers and students registered with their school email addresses. Students affected by quarantines could now regain a little structure in their otherwise amorphous, home-bound days, and parents were all too grateful. As one commenter, apparently a teacher at a public school for deaf children, wrote in response to coverage of the offer: “Thank you, thank you, thank you Eric Yuan!”

Yet it would seem that no good deed goes unpunished: two weeks later, according to a New York Times article, the company received a letter from the New York Attorney General expressing concerns about its data privacy and information security practices. The Attorney General’s office requested information about Zoom’s measures against a proliferating form of harassment dubbed “Zoombombing,” to which the company has evidently already responded in a statement. But Zoom’s security was not the only issue under scrutiny. The letter also requested information about how the company handles children’s personal data in light of the large number of minors now on the platform.

A whipping boy for larger industry issues.

The video conferencing product was originally intended for businesses, not children, but now it suddenly finds itself up against a number of children’s privacy laws. By contrast, most dedicated K-12 online learning platforms have, at minimum, dealt with federal regulations on children’s data from their very inception. One such law, the Children’s Online Privacy Protection Act (“COPPA”), limits the collection and use of the personal data of minors under 13, such as names and email addresses, and its updated rule extends to online identifiers. Also at issue for many companies aimed at the K-12 set are FERPA, a federal law governing student education records, and California’s own SOPIPA, the Student Online Personal Information Protection Act.

And these aren’t the only sheriffs in town protecting children’s data. The beginning of this year saw the California Consumer Privacy Act (“CCPA”) take effect, placing additional burdens on businesses: they must obtain affirmative consent from minors under 16 before selling their personal information, or, for those under 13, obtain ‘verifiable parental consent.’ The law is very new, and companies big and small are still in the process of shoring up their compliance stance, even companies that do in fact directly market to children.

It’s difficult to imagine how a video conferencing company might adequately address all of these laws, especially during this massive surge in use of its product. Yet, however troubling this may be, we should also remember that this issue is far from unusual, and certainly not new to this particularly sensitive realm of data privacy. Now in the limelight, Zoom is serving as a whipping boy for larger industry issues.

The Coronavirus pandemic has brought a number of data privacy topics to the fore: from the appropriate use of personal health data to trace infection, to the management and security of employee data in the wake of work-from-home mandates where the lines between business and personal are blurred to an extreme.

Now that so many students can’t go to school, they are turning more than ever to online platforms to supplement their education, or just to have fun on their phones while stuck at home. Children’s online privacy in the era of edtech is only becoming more relevant, and it’s becoming increasingly difficult to ignore the fact that children use countless online services not specifically aimed at them. Right now, companies can argue that the laws don’t apply to them because children aren’t their intended audience, claiming ignorance about whether any children are using their services, regardless of the reality (the crux of the FTC’s recent case against YouTube/Google).

These businesses have a responsibility to protect children’s data, regardless of their product’s original target audience. Hopefully, the disproportionate publicity Zoom is getting right now will encourage other companies to consider what to do when children unexpectedly start using their services.

Sara Hobe

Data privacy specialist (CIPP/E) and doctor of philosophy located in San Francisco. Interested in contextualizing current events in human history.