Alexa, Are You Listening to Me?
Educating people on what voice devices mean for their privacy
“But what if it’s always listening to me?” is an all too common question about voice technology. For those in the voice industry, it can be exhausting to hear this from consumers skeptical of voice assistants and smart devices.
With visual interfaces increasingly being added to this technology (e.g., the Echo Show), people are understandably concerned about each new sensor, camera, or microphone added to the products in their homes. To what extent is this technological development an invasion of our privacy?
To answer this question, it is important to understand both the constraints of voice technology and why consumers are concerned in the first place. Based on research and personal experience, the most common questions users have about voice devices are:
- To what extent does a device listen to a person’s conversations?
- How is the data used?
- Will my data be stolen?
- How can vulnerable individuals be protected?
Your Device Is Listening, Sometimes
Nobody likes an eavesdropper, especially one you have invited into your home. Research conducted by Northeastern University found that smart speakers tend to “wake up” at the wrong times.
After researchers played more than 100 hours of audio from popular TV shows to simulate constant conversation, the devices triggered a false positive, accidentally “waking up,” nearly once per hour. However, these activations typically lasted under 10 seconds and ended with the device asking the user to repeat themselves for confirmation.
When the researchers tried to replicate these false positives, the devices rarely made the same mistake twice. This indicates that while the technology has its imperfections (as does much emerging technology), it can learn from its mistakes.
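To make the mechanism concrete, here is a minimal sketch, in Python, of how a wake-word loop behaves. The threshold value and function names are assumptions for illustration, not any vendor’s actual implementation: the device continuously scores short audio frames on-device, discards everything below the threshold, and only starts recording (and occasionally mis-fires) when a frame scores above it.

```python
import random

WAKE_THRESHOLD = 0.85  # assumed value; real thresholds are tuned per device


def wake_word_score(audio_frame: bytes) -> float:
    """Hypothetical stand-in for the on-device acoustic model that scores
    how closely a short audio frame resembles the wake word (0.0 to 1.0)."""
    return random.random()


def handle_audio_frame(audio_frame: bytes) -> str:
    """Decide what the device does with one frame of ambient audio."""
    score = wake_word_score(audio_frame)
    if score < WAKE_THRESHOLD:
        # Most frames never cross the threshold: nothing is recorded,
        # nothing leaves the device, and the frame is discarded.
        return "asleep"
    # A frame that merely sounds like the wake word (a line of TV
    # dialogue, for example) can cross the threshold. That is the
    # false positive: the device records a short clip and, if it
    # cannot make sense of it, asks the user to repeat themselves.
    return "awake: recording a short clip, will ask for confirmation"


if __name__ == "__main__":
    # Simulate a short stretch of background conversation.
    for frame in [b"ambient-audio"] * 10:
        print(handle_audio_frame(frame))
```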
Understandably, the concept of machine learning makes consumers wary of what devices are learning about our behavior and how. But at its core, machine learning is meant to reduce errors and make better predictions than humans could produce manually.
The consensus is that, yes, your device may be listening to you at inopportune times. However, several privacy settings are available. Much as with website settings, a user can enable audible alerts that signal when a voice recording begins, delete the recording history, or mute the device entirely.
Opt In To Security
As the Internet of Things (IoT) transforms how consumers integrate technology into their everyday lives, users must be aware of how smart devices capture and use their information. The proliferation of this technology has made people wary of surveillance, and voice devices are one way people feel “monitored.”
Typically, data collected by these devices may be sold to advertisers, who then display targeted ads on the channels users visit. As a rule of thumb, if a product is free, there is a chance that you are the product.
The conversation surrounding how our online data is used goes far beyond privacy on smart devices; it concerns the companies that control your information. With concerns about large-scale data breaches and phishing attacks, there is a push toward opt-in rather than opt-out defaults.
Personal data is usually collected automatically when individuals visit websites unless they opt out through certain settings. Opting out is, of course, easier said than done, with many companies requiring you to sift through terms-and-conditions policies.
There is also the question of how to find out which companies already hold data on you that you cannot take back. While GDPR (the General Data Protection Regulation) in Europe sets standards for obtaining opt-in user consent, no equivalent standard yet exists in the United States.
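The difference between the two defaults is easy to gloss over in a settings menu, but it is simple to state in code. Below is a minimal sketch with hypothetical setting names; the point is only that under opt-in, sharing stays off until the user explicitly turns it on.

```python
from dataclasses import dataclass


@dataclass
class ConsentSettings:
    """Hypothetical consent record for a single user."""
    share_with_advertisers: bool
    keep_voice_recordings: bool


# Opt-out model: collection and sharing are on unless the user finds
# the setting and switches it off.
OPT_OUT_DEFAULTS = ConsentSettings(
    share_with_advertisers=True,
    keep_voice_recordings=True,
)

# Opt-in model (the direction GDPR pushes toward): nothing is shared
# or retained until the user explicitly agrees.
OPT_IN_DEFAULTS = ConsentSettings(
    share_with_advertisers=False,
    keep_voice_recordings=False,
)

print(OPT_OUT_DEFAULTS)
print(OPT_IN_DEFAULTS)
```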
Create Your Safe Environment
For those concerned about voice assistants being hacked, there are several ways to add an extra layer of security. Many of the tactics recommended by Norton are similar to securing a desktop or phone.
These include strengthening account passwords, turning off voice purchasing (especially if your device sits in a shared space), using a secure Wi-Fi network, and enabling voice recognition.
These practices can be applied similarly to mobile devices, which are arguably even more integrated into one’s daily routine than smart speakers, given their location-tracking capabilities.
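As a rough illustration, the tactics above can be treated like any other security audit. The field names below are hypothetical; real devices expose these options in their companion apps under different labels.

```python
# Hypothetical audit of the settings described above.
SECURITY_CHECKLIST = {
    "strong_unique_account_password": True,
    "voice_purchasing_disabled": False,   # matters most in shared spaces
    "wifi_uses_wpa2_or_wpa3": True,
    "voice_recognition_enabled": False,   # respond only to known voices
}


def audit(checklist: dict) -> list:
    """Return the items that still need attention."""
    return [item for item, done in checklist.items() if not done]


if __name__ == "__main__":
    for item in audit(SECURITY_CHECKLIST):
        print("Still to do:", item)
```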
When it comes to health information, devices that handle it must comply with HIPAA (the Health Insurance Portability and Accountability Act), which provides added security. The nuances depend on where and how these devices are used.
Many platforms are not yet HIPAA compliant (although Amazon Alexa is), and both providers and patients need to be educated on the technology’s limitations.
For example, if a doctor on the go wants to hear a recorded briefing on a patient’s status, they need to be in a private space. For patients following up on medications through a home smart speaker, the system should verify that the patient is alone.
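A sketch of what such a check might look like is shown below. The function and identifiers are hypothetical; a real skill would lean on the platform’s own voice-profile and authentication features rather than rolling its own.

```python
def can_read_medication_details(speaker_id: str,
                                enrolled_patient_id: str,
                                other_voices_detected: bool) -> bool:
    """Only disclose protected health information when the enrolled
    patient is the one speaking and no one else appears to be present.
    All identifiers here are hypothetical."""
    if speaker_id != enrolled_patient_id:
        return False
    if other_voices_detected:
        # Better to offer a more private channel, such as sending the
        # details to the patient's phone, than to read them aloud.
        return False
    return True


if __name__ == "__main__":
    print(can_read_medication_details("voice-123", "voice-123",
                                      other_voices_detected=False))
```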
Design for All Users
With millions of people using smart speakers and voice assistants, the notion of “an average user” becomes tenuous. For parents, this is an important discussion to have when introducing children to new devices. COPPA, the Children’s Online Privacy Protection Act, prevents companies from collecting data on children under age 13 without parental consent.
However, without separate profiles on an Amazon Echo, for example, a child’s interactions could still be tracked. While companies are still working out how best to protect children’s privacy, many of the same strategies for securing anyone’s online identity can be applied to children.
Many scams have historically targeted elderly people. Robocalls built on the latest conversational AI technology are responsible for many of these scams, and the same tactics could extend to other voice-first devices.
“Vishing” refers to the use of voice communication to scam victims, and educating elderly people on how to spot these tactics is a good first step. Financial institutions are also adding speech recognition technology to verify sensitive phone transactions.
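As a teaching aid, the “spot the tactic” advice can even be written down as a simple keyword checklist. This is a rough sketch for illustration only, not a real fraud filter; the phrases are assumed examples of common pressure tactics.

```python
# Illustrative pressure-tactic phrases; a real anti-fraud system would
# be far more sophisticated than simple keyword matching.
RED_FLAGS = [
    "urgent action required",
    "verify your social security number",
    "buy gift cards",
    "wire the money today",
    "do not tell anyone",
]


def looks_like_vishing(transcript: str) -> bool:
    """Flag a call transcript that contains common scam pressure tactics."""
    text = transcript.lower()
    return any(flag in text for flag in RED_FLAGS)


if __name__ == "__main__":
    print(looks_like_vishing(
        "This is your bank. Urgent action required: buy gift cards now."))
```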
Companies will need to educate consumers on how devices with voice technology can protect privacy and be used safely. Alexa and other devices may be listening, but for consumers, that should be a good thing, not a bad thing.
Like this article? Let’s connect on LinkedIn or chat over virtual coffee.
Originally published at https://minutehack.com.