Maybe Don’t Buy Toys That Record Everything You Say

Think of all the toys that DON’T record everything!

My Friend Cayla commercial. This doll is clearly plotting something.

I’ve started using the “Hey, Siri” feature on my phone, which I love: I can just stand in the middle of the room and say “Hey Siri, how cold is it outside?” and, assuming my phone is plugged in, Siri will respond. (Why can’t Siri respond when my phone isn’t plugged in? Answer me that.)

But I’ve obviously started wondering whether Siri listens to all the stuff I say that isn’t “Hey Siri,” because she has to, right? Like, if I decided to sing “Hey Jude,” would Siri be all like oh, this might be for me, and then be disappointed when it wasn’t?

And part of me accepts that you just have to trust the creepy future, that my phone is going to listen in on everything I do and maybe it won’t be that bad, and then I go visit Consumerist and read this:

Consumerist’s Kate Cox does a deep-dive on two interactive children’s toys: My Friend Cayla and the i-Que Intelligent Robot. Both toys listen to and record children’s speech and use that information to have simple conversations—but the toys themselves are anything but simple.

When users first set up the app for their toy, they may be sharing data you don’t want shared. Cayla in particular asks for multiple pieces of personal information — the child’s name, their parents’ names, their school name, their hometown, among other questions — so it can converse more naturally. The app also allows for location setting, and both the Cayla and i-Que apps collect users’ IP addresses.
So far this is pretty straightforward. The Terms of Service for both toys say that they collect data in order to improve the way the toys work, and for “other services and products.”
Researchers studied the way the toys work, the complaint continues, and it turns out that they send audio files to a third party: Nuance Communications’ servers at the company’s headquarters in Massachusetts.

Why would a doll want to know a kid’s parents’ names? A kid’s parents’ names are some variation of “Mom” and/or “Dad,” unless the parent has gotten on the whole “Papa” trend.

But aside from having all of the information necessary to answer the security questions on a child’s future bank account, these toys may also be using children’s voices for their own purposes.

And here’s where it starts to get more complicated: both toys are also governed under Nuance’s general privacy policy, which says, “We may use the information that we collect for our internal purposes to develop, tune, enhance, and improve our products and services, and for advertising and marketing consistent with this Privacy Policy.”
Then the third parties come in: because of the other work Nuance Communications does, some of the 30 million voice prints it claims to have access to — for the purpose of enhancing its ability to parse and analyze audio files on behalf of law enforcement — may well be generated by eavesdropping dolls.

This is where I’m going to advise you to read the whole piece, because it’s both detailed and really unnerving, and I haven’t even gotten to the part where the dolls can potentially be hacked, the same way hackers used internet-connected devices to bring down Twitter last month.

So… if you’re doing your holiday shopping, maybe don’t buy these toys? There are lots of other toys. Plenty of them don’t contain recording devices!

I’ll end with these toys’ respective commercials, which are both terrifying and both include the same tagline: [Toy Name] knows millions of things. Including everything about you.