BIGtoken Team
Mar 28

If you live in one of the 20 million homes with a voice assistant, you’ve accepted a mixed bag of benefits and potential risks. Voice assistants can be great for simplifying daily activities like checking the weather, playing music, and setting reminders, but they’re not yet perfect devices. For starters, every command you utter to your voice assistant is recorded and stored. Plus, some of the integrations, like voice purchasing and voice-activated security, have had to go through some painful trials and tribulations. Let’s get into it.

The good and bad of recording databases

It’s important to remember that every time you ask a voice assistant for something, that command is recorded and stored. Keep that in mind going forward, and if you’d like to learn more about managing those recordings, check out our guide here. One of the key issues covered in that guide is that voice assistant technology isn’t entirely precise yet. Sometimes your device mishears something as its wake word and starts recording when it’s not supposed to. That’s all the more reason to manage your recordings diligently.

An important caveat: these recordings can be presented as evidence in court, whether they work for or against you. Last year, a New Hampshire judge ordered Amazon to release Echo recordings from the scene of a double murder, which could potentially reveal information leading to the killer. It isn’t known whether Amazon agreed to this request, but the company has complied with court orders regarding Alexa recordings in the past. For more information on that prior case, read this in-depth TechCrunch piece.

Voice purchasing gone bad

Ordering your favorite Amazon products without lifting a finger may seem like an undeniable luxury, but how much power does voice purchasing really have once it’s enabled? Last Christmas, parents unwittingly ordered a pile of items from Amazon simply by discussing possible gift ideas for their children within earshot of their devices. Going forward, parents may need to be as careful about what they say in front of an Echo Dot or Google Home as they are in front of their own children, especially when discussing presents.

There have been multiple incidents involving children and the purchasing abilities of their family’s voice assistant. Two years ago, a 6-year-old Texas girl accidentally ordered a $160 dollhouse and a four-pound tin of cookies while having a casual conversation with her beloved Alexa. Her confused parents looked through the recordings and found the exact conversation in which their daughter mentioned the dollhouse she wanted. A person doesn’t even have to be in the room to activate your voice assistant: in 2017, TV anchor Jim Patton said, “Alexa, order me a dollhouse” while on the air and reportedly triggered dollhouse orders for viewers who had Echo devices nearby.

You can avoid these issues by setting up a confirmation code for voice purchasing, or by disabling the feature altogether. Neither protection is on by default, so make sure you put it in place yourself. Read more about how to further protect purchase security here.


The reality of voice-assisted security

With a few new integrations, Alexa can now be used as a home security device. Alexa Guard lets the device act as a glass-break detector: when you say “Alexa, I’m leaving,” it automatically starts listening for the sound of breaking glass and quickly notifies you if it hears it. While this element of voice-assisted security is helpful, another feature could seriously put your home in danger: voice-controlled door locks.

In 2016, a Missouri resident was installing his new Apple HomeKit system, which allowed him to control his door locks, light switches, and thermostats from his iPad. The new technology seemed to ease his worries until a neighbor jokingly walked up to his front porch and shouted, “Hey Siri, unlock the door!” Sure enough, the door unlocked and the neighbor walked right in. Security features on voice assistants have since improved, requiring four- to six-digit PIN codes before a lock will respond. Having to yell out the passcode that secures your entire home, however, doesn’t seem all that safe either. These integrations are works in progress, and they should be treated as such. Until they’re dependable, they shouldn’t be your only line of defense.


Data party with third-party companies

This last point is an important one: third-party accessibility is on YOU, according to Amazon’s privacy policy. Amazon accepts no responsibility or liability for the third-party companies that have partnered with Alexa; it’s your responsibility to know which third-party features are enabled. Understand, for example, that your zip code may be shared with The Weather Channel when you ask about the weather. Companies such as Spotify, Uber, Nest, Kayak, Capital One, Belkin, Ring, and Sonos have jumped at the opportunity to integrate their services into the popular voice assistants. While these integrations are helpful, make sure you understand how your data is shared and exchanged in return. Explore the full list of possible third-party integrations here, be aware of which third-party companies are enabled on your home device, and keep track of whom you’ve granted access.

The companies behind these powerful voice assistants publish privacy policies that break down each aspect of the devices that may concern users. Examine them carefully so you’re fully aware of which permissions you’ve accepted, what they mean, and how to manage them on an ongoing basis. In general, we don’t think voice-activated assistants are something to be afraid of. But in our world of data collection, it’s important to know how your home device sends your personal information out into the world. And it’s even more important to use any and all integrations cautiously at first, as users are likely to discover problems along the way.

By: Hailey Karns

BIGtoken

The blockchain platform for consumers to own, verify, and sell their data. https://bigtoken.app.link/mediumdownload