If you live in one of the 20 million homes with a voice assistant, you’ve accepted a mixed bag of benefits and risks. Voice assistants are great for simplifying daily activities like checking the weather, playing music, and setting reminders, but they’re not yet perfect devices. For starters, every command you utter to your voice assistant is recorded and stored. And some of the integrations, like voice purchasing and voice-activated security, have gone through painful growing pains. Let’s get into it.
The good and bad of recording databases
For starters, it’s important to remember that every time you ask a voice assistant for something, that command is recorded and stored. Keep that in mind moving forward, and if you’d like to learn more about managing those recordings, check out our guide here. One of the key issues covered in that guide is that voice assistant technology isn’t entirely precise yet. Sometimes your device can mistake ordinary speech for its wake word and start recording when it’s not supposed to. That’s even more reason to diligently manage your recordings.
An important caveat: these recordings can be brought forth as evidence in court, whether they work for or against you. Last year, a New Hampshire judge ordered Amazon to release Echo recordings from the scene of a double murder, recordings that could contain information pointing to the killer. It is not known whether Amazon complied with this request, but the company has handed over Alexa recordings in court proceedings before. For more on that prior case, read this in-depth TechCrunch piece.
Voice purchasing gone bad
Ordering your favorite Amazon products without lifting a finger may seem like an undeniable luxury, but how much power does voice purchasing really have once it’s enabled? Last Christmas, some parents unknowingly placed Amazon orders simply by discussing possible gift ideas for their children within earshot of their devices. Going forward, parents may need to be as careful about what they say in the presence of an Echo Dot or Google Home as they are in front of their own children, especially when discussing presents.
There have been multiple incidents involving children using their family’s voice assistant to make purchases. Two years ago, a 6-year-old Texas girl accidentally ordered a $160 dollhouse and a 4-pound tin of cookies while having a casual conversation with her beloved Alexa. Her confused parents looked into the recordings and found the exact conversation in which their daughter mentioned a dollhouse she wanted. The speaker doesn’t even have to be in the room to activate your voice assistant: radio personality Jim Patton said, “Alexa, order me a dollhouse” while on the air in 2017 and triggered purchase attempts of hundreds of dollhouses for listeners who had Echo devices nearby.
You can avoid these issues by setting up a confirmation code for voice purchasing, or by disabling the feature altogether. But these aren’t the default settings, so make sure you put the proper protections in place. Read more about how to further protect purchase security here.
The reality of voice-assisted security
With a few new integrations, Alexa can now double as a home security device. Alexa’s “guard mode” turns the device into a glass-break detector: when you say “Alexa, I’m leaving,” it automatically starts listening for the sound of breaking glass and quickly notifies the homeowner when that noise is detected. While this element of voice-assisted security is genuinely helpful, another feature could seriously put your home in danger: Bluetooth-enabled door locks.
In 2016, a Missouri resident installed a new Apple HomeKit system that let him control his door locks, light switches, and thermostats from his iPad. The new technology seemed to ease his worries until a neighbor jokingly walked up to his front porch and shouted, “Hey Siri, unlock the door!” The frightening truth revealed itself as the door unlocked and the neighbor walked right in. Security features on voice assistants have since improved, with four- to six-digit PIN codes now required to enable the feature. Having to yell out the passcode to your entire home security system, however, doesn’t seem all that safe either. These integrations are works in progress, and they should be treated as such. Until they’re more robust, they shouldn’t be your only line of defense.
Data party with third-party companies
The companies behind these powerful voice assistants publish privacy policies that break down the aspects of the devices most likely to concern users. Carefully examine the privacy policies of your listening devices to ensure you’re fully aware of what permissions you’re accepting, what they mean, and how to regulate them over time. In general, we don’t think voice-activated assistants are something to be afraid of. But in our world of constant data collection, it’s important to know how your home device sends your personal information out into the world. And it’s even more important to use any and all integrations cautiously at first, as users are likely to discover problems along the way.
By: Hailey Karns