Privacy and security defects in daycare apps let Big Brother be a Peeping Tom
In an era of rampant discord on just about everything, one of the few things people do seem to agree on is that young children ought to be kept safe. Everybody wants that, or at least says they do.
But as usual, there’s saying and then there’s doing. And according to a couple of recent investigations, there isn’t nearly enough being done to keep children who attend daycare and preschool safe from online threats.
Two groups — several parents of young children at the Electronic Frontier Foundation (EFF), and researchers in Germany from the AWARE7 agency, Institute for Internet Security, Max Planck Institute for Security and Privacy, and Ruhr University Bochum — both reported recently that applications designed to involve parents with their children’s care are alarmingly lacking in privacy and security protections.
Those apps, which many daycare centers and preschools require parents to use, can provide notifications of feedings, diaper changes, pictures, activities, and which guardian picked up or dropped off the child, all of which the EFF called “potentially useful features for overcoming separation anxiety of newly enrolled children and their anxious parents.”
But when Alexis Hancock, director of engineering for Certbot at the EFF and a parent of young children herself, asked if the apps were secure, “the answer was a resounding No,” she reported. With a bit more research, she also discovered that most don’t even offer two-factor authentication (2FA).
One of the most popular apps she investigated, Brightwheel, didn’t even have an email address for parents to report security concerns. She was eventually able to contact the company, and noted that Brightwheel rolled out 2FA this year, posting an announcement about it on YouTube on June 28.
But the declaration that they were the “first partner in the early education industry to add this extra level of security,” while welcome news to Hancock, was “also potentially worrisome.”
Indeed, if they’re the first to offer 2FA, that would mean none of the others are doing it. And Hancock reported that she found other popular daycare and early education apps lacking in more than 2FA.
“Through static and dynamic analysis of several apps, we uncovered not just security issues but privacy-compromising features as well,” she wrote. Those include “weak password policies, Facebook tracking, cleartext traffic enabled, and vectors for malicious apps to view sensitive data.”
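One of those findings, “cleartext traffic enabled,” refers to an Android app opting back in to unencrypted HTTP. Since Android 9, cleartext connections are blocked by default, but a manifest flag or network security config can re-enable them; a vendor can instead forbid them explicitly with a configuration like the minimal sketch below (illustrative only, not taken from any of the audited apps):

```xml
<!-- res/xml/network_security_config.xml: refuse all unencrypted HTTP -->
<network-security-config>
    <base-config cleartextTrafficPermitted="false" />
</network-security-config>
```

The file is then referenced from the app’s AndroidManifest.xml via the `android:networkSecurityConfig` attribute on the `<application>` element, so every connection the app makes must use TLS.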
Even more worrisome, when she pointed out security weaknesses to the app vendors, rather than a grateful response and vows to fix them, she “received little to no response.”
The same was true for the German researchers, who reported that they used static and dynamic analysis, configuration scanners, and various apps’ privacy policies to analyze the “privacy and security of 42 Android childcare applications and their cloud-backends.”
Their findings were just as alarming as those from the EFF. “[M]any third-party (tracking) services are embedded in the applications and adversaries can access personal data by abusing vulnerabilities in the applications,” according to the researchers.
Yet only six of the vendors responded to their findings.
And apparently, so far, they don’t have to respond, even though it would seem that, at least in the U.S., that kind of collection and sharing of data about young children would be illegal.
The Children’s Online Privacy Protection Act (COPPA), which has been around since 1998, imposes privacy requirements on “operators of other websites or online services that have actual knowledge that they are collecting personal information online from a child under 13 years of age.”
So even if children are only 3 or 4 years old and aren’t aware that they’re providing data to an application, the operators of the apps surely know it. As the German researchers put it, while the children don’t use them directly, the apps “store and process much of their data.”
Those researchers also found that “childcare management apps generally request several dangerous permissions, which is even higher when third-party (tracking) libraries are used. The analyzed apps share user data, user interaction, device data, and app data with third-party services … Moreover, we found that the developers of the apps do not even mention the use of such third-party tracking services in their privacy policies.”
The Federal Trade Commission (FTC), the agency that enforces the Act, has the text of COPPA posted on its website. It says the privacy and data collection requirements in the Act apply to “any operator that has actual knowledge that it is collecting or maintaining personal information from a child [younger than 13].”
Those requirements include notification on its website about what data it collects and how it uses, stores, and shares it; getting verifiable parental consent for the collection and use of the data; and protecting the confidentiality of the data.
Juliana Gruenwald of the FTC’s Office of Public Affairs said the agency “cannot comment on specific companies or practices,” but that the FTC has “brought more than 30 COPPA-related enforcement actions.”
Based on the FTC’s 2022 report to Congress, it doesn’t look like any of the 11 actions brought in the past five years involved vendors of the apps analyzed by the EFF and the German researchers.
Most involve consent agreements with companies that make gaming or toy-related apps for children, although two of the biggest fines, for data collection violations, were against YouTube and Google ($170 million) and TikTok ($5.7 million).
Vulnerabilities affecting the vulnerable
This doesn’t make childcare apps worse than most other apps on the market when it comes to defects in privacy and security. Software is never, and never will be, perfect. Sammy Migues, chief scientist with the Synopsys Software Integrity Group, invoked a sardonic quote from 1975 attributed to the late Gerald Weinberg, then a computer scientist at the University of Nebraska, which is now known as Weinberg’s Law: “If builders built buildings the way programmers write programs, the first woodpecker that came along would destroy civilization.”
Still, the fact that young children are involved means those defects affect one of the country’s most vulnerable demographics. And just because software isn’t perfect doesn’t mean it can’t be made much closer to perfect. There are multiple basic security and privacy measures app vendors could implement to keep themselves out of the category security experts call “low-hanging fruit” for cyberattackers.
The EFF recommends:
- Make 2FA available for all administrators and staff
- Fix known security vulnerabilities in mobile applications
- Disclose and list any trackers and analytics and how they are used
- Use hardened cloud server images and establish a process to continuously update out-of-date technology on those servers
- Lock down any public cloud buckets hosting children’s videos and photos. These should not be publicly available
- Use end-to-end encrypted messaging between school and parents
- Create security channels for reporting vulnerabilities
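The bucket-lockdown item is straightforward to act on. Assuming a vendor hosts children’s photos and videos in AWS S3 (the other major clouds have equivalent controls), the “Block Public Access” configuration below shuts off every path to anonymous access; this is a generic sketch, not drawn from any of the audited apps:

```json
{
  "PublicAccessBlockConfiguration": {
    "BlockPublicAcls": true,
    "IgnorePublicAcls": true,
    "BlockPublicPolicy": true,
    "RestrictPublicBuckets": true
  }
}
```

With the bucket sealed off, the app can still show parents individual photos by issuing short-lived signed URLs, so no content is ever reachable without authentication.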
Debrup Ghosh, senior product manager with the Synopsys Software Integrity Group, called 2FA a “bare minimum for security. This should be a must-have in your feature set today.”
He said that should include passwordless authentication with one-time-use code links that are both convenient and much more secure.
He also suggested that daycare and early education centers do a bit of security crowd-sourcing. “I recommend they reach out to the community and ask for volunteers to help evaluate these apps,” he said.
And for parents and other adults involved in using the apps, “make sure you understand and exercise your rights either with COPPA or [in California] the California Consumer Privacy Act. They offer a lot of data protection and prevent your data from being sold,” he said.
To that advice, the German researchers add that there is an obvious need for better transparency between the vendors and their customers/users.
“Our results show that the privacy policies [of the analyzed apps] are unclear about the data collection practices, do not state how the companies protect the children’s data, and underreport on the scope of data shared with third-party services,” they wrote.
“Concerningly, some of the tested applications relied on misconfigured cloud storage that allowed anyone to access and download data ranging from children’s activity records and messages to personal photos.”
There oughta be a law
What are the chances that this increased parental awareness will lead to substantive change? Michael White, applications engineer with the Synopsys Software Integrity Group, said it could come from more government involvement.
A section on product labeling in President Joe Biden’s May 2021 executive order on cybersecurity and a proposed law on product security in the U.K. would require “retailers, wholesalers, importers, and manufacturers to be transparent about what their products contain and where they originate. In the case of the proposed U.K. regulations in particular, there are also financial penalties for retailers and importers who fail to comply with the rules,” he said.
It’s hard to imagine much opposition to any of that. As Alvaro Martin Bedoya, who became FTC commissioner earlier this year, put it in a statement on COPPA policy, “Kids have a right to learn in privacy.”