Growing Privacy Concerns with Voice-enabled Devices

Shyamala Prayaga
Digital Assistant Academy
Nov 29, 2020 · 6 min read
Photo: Amazon Echo Spot on a wooden background (akulamatiau)

In 2020, adoption of voice technology grew by 48%, according to Statista.

For all that growth, general usage of voice assistants hasn’t changed much. The majority of use cases are limited to music, weather, home controls, and other simple commands.

That's odd, because the technology is increasingly capable of handling complex tasks and serving users in richer ways.


Most consumers know this. But the reason they're hesitant to adopt some of those new features boils down to one thing: privacy.

They're simply not convinced that voice-enabled devices are handling their data responsibly.

There’s a way to reassure them, but it’s best to get a clear understanding of the problem first.

Why Privacy is a Mounting Concern with Voice-enabled Devices

Just about everyone knows that voice assistants collect information. In fact, this is a factor that has altogether dissuaded 28% of respondents in a Consumers International study from even purchasing a smart speaker.

Of those who already own smart speakers, 59% report having privacy concerns.


There are two major issues.

The first is that people don’t know what happens to data that’s collected with their knowledge. That is, the data that the voice assistant gathers to process and respond to a request.

Most people, 75% to be exact, know the data is used to improve the voice assistant and for targeted ads, but that's about it.

The second issue is that people worry about data being collected without their awareness, while they are not using the device. This has been a widely discussed point, especially with the Amazon Echo.

Considering where most people keep their devices, it’s no surprise they’d have those concerns. Research by Voicebot.ai reveals that most smart speakers are kept in the living room (44.4%) and in the bedroom (37.6%).

These are places where someone might share personal details, or simply not want anyone listening in — not even a device.

Third-party Data Concerns

Apart from concerns about what voice data the device manufacturer may be picking up and storing, there are also worries about how that data is shared with third parties.

In a study by PwC, 38% of respondents expressed fears about receiving ads that feel “too targeted” on their voice-interactive devices.

For example, someone who asked their Google Nest speaker for prices on tomato dip likely won't be bothered by a grocery ad coming up the next time they're listening to a podcast.

It's an altogether different situation, however, if they ask their smart speaker to text a friend a happy-birthday message and then get served an ad for thoughtful birthday gifts.

There's also the fact that these voice assistants hold deeply personal information: biometric data from users' voices and, through some of the features they offer, intimate details about lifestyle habits.

Users generally don’t want this getting handed off to third parties, and they don’t want it getting leveraged in any way. At most, they want it to help improve the experience of using the voice assistant — and only to the extent that is necessary.

Considering the massive and positive impact that voice user interfaces are already making in the lives of people all over the world — and the potential they can still fulfill — this is an issue very much worth addressing.

The Solution: Privacy by Design

Privacy by Design is an approach to creating and developing products that makes privacy a core focus at every phase of production and carries that commitment through every point where the user interacts with the product.

For conversational AI, that means baking privacy-centered features into every aspect of the design process.

Instead of developing the voice user interface and then thinking about embedding privacy features down the line, designers make sure privacy has been accounted for at each stage of the journey.

Privacy by Design is generally presented as seven foundational principles:

Privacy by Design principles (source: https://deviq.io/resources/articles/privacy-by-design/)

● Being proactive, not reactive

● Keeping privacy as the default

● Embedding privacy into the design

● Offering full functionality

● Ensuring end-to-end security

● Maintaining transparency

● Respecting user privacy

Adopting Privacy by Design for Voice User Interfaces

Traditional design treats privacy proactively only to a limited extent. For the most part, it's a matter of shipping some privacy features, then waiting for breaches to happen before realizing which other protections are needed. It's a reactive approach.

A proactive one serves users much better.

For example, a conversation designer may actively seek out common words that can mistakenly trigger the voice assistant to start listening for commands — then make sure the voice assistant is optimized to avoid those errors, and only start listening when it picks up its exact wake word.

This is much more proactive than relying on user feedback alone, which may only arrive after serious privacy breaches have already happened.
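To make this concrete, here is a minimal sketch of such a wake-word gate in Python. Everything in it, the wake word, the confidence threshold, and the false-trigger list, is a hypothetical illustration rather than any vendor's actual detection logic:

```python
# Minimal sketch of a privacy-minded wake-word gate.
# All names and thresholds are illustrative assumptions,
# not any vendor's actual API or model.

WAKE_WORD = "aurora"             # hypothetical wake word
CONFIDENCE_THRESHOLD = 0.92      # wake only on high-confidence matches
KNOWN_FALSE_TRIGGERS = {"aura", "roar", "around"}  # near-misses found in testing

def should_wake(transcript: str, confidence: float) -> bool:
    """Open the microphone pipeline only for an exact, high-confidence
    wake word; block known near-miss phrases outright."""
    word = transcript.strip().lower()
    if word in KNOWN_FALSE_TRIGGERS:
        return False             # proactively reject common misfires
    if word != WAKE_WORD:
        return False             # exact match only, no fuzzy wake
    return confidence >= CONFIDENCE_THRESHOLD

# A near-miss never opens the mic, even at high confidence.
assert should_wake("aurora", 0.95) is True
assert should_wake("aura", 0.99) is False
assert should_wake("aurora", 0.60) is False
```

The design choice worth noticing is that every ambiguous case fails closed: when the device is unsure, it stays silent rather than recording.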

And after finding fixes for potential breaches, conversation designers ought to make them part of the default use of the voice assistant, rather than something users need to opt into.
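One way to picture privacy as the default is a settings object whose most protective values require no action from the user. This is a hypothetical sketch; the field names are invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class PrivacySettings:
    """Illustrative voice-app settings: every field ships in its most
    protective state, and users opt in to anything less private."""
    store_recordings: bool = False          # keep no raw audio by default
    share_with_third_parties: bool = False  # no data leaves the service
    use_for_ad_targeting: bool = False      # no ad profiling
    retention_days: int = 0                 # delete transcripts immediately
    improve_models_with_my_data: bool = False  # opt-in, never opt-out

settings = PrivacySettings()    # safe out of the box, no setup required
settings.retention_days = 30    # only an explicit user choice relaxes it
```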

Plus, instead of waiting for the whole application or feature set to be designed in full, this process can be a regular, stage-by-stage element of development.

Privacy protections shouldn't come at the cost of functionality, either. That means conversation designers need to make the privacy features of their applications robust and scalable enough to handle complex features, without repeatedly asking users for new permissions or interrupting them with disclaimers.

Also, as conversation designers develop and update their applications, it's crucial for them to tell users how data from their commands is being used to optimize the voice interface. Transparency like this can increase user trust by 68%, according to Harvard Business Review.
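As a small illustration of that kind of transparency, a voice app could answer a direct question about data use in plain language, right in the conversation. The intent name and wording below are assumptions, not any platform's real API:

```python
# Hypothetical intent handler: answer "what do you do with my data?"
# in plain language instead of pointing users to a policy page.

DATA_USE_SUMMARY = (
    "I keep a transcript of your requests only long enough to answer them, "
    "I never share them with advertisers, and you can say "
    "'delete my history' at any time."
)

def handle_intent(intent_name: str) -> str:
    """Route a recognized intent to a spoken response; only the
    privacy-disclosure path is sketched here."""
    if intent_name == "ExplainDataUse":
        return DATA_USE_SUMMARY
    return "Sorry, I can't help with that yet."

print(handle_intent("ExplainDataUse"))
```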

Though more people are warming up to and adopting voice user interfaces, not all of them are doing so with a feeling of reassurance. They are right to be concerned about their data, and conversation designers need to step in and build their applications around privacy features users can trust. Privacy by Design is a crucial element of the solution.

About Digital Assistant Academy

Digital Assistant Academy provides Voice Interaction Design and Conversation Design training and certification. In this program, we will take you from the very basics of voice interaction and conversation design through to how voice technologies work. We'll do a deep dive into conversation design strategy, and the program is fully hands-on with your capstone projects. By the end of the course, you will have designed, developed, and deployed two voice applications. Learn more at https://www.digitalassistant.academy/

Before you go

Clap 👏 if you enjoyed this article to help raise awareness of the topic, so others can find it too
Comment 💬 if you have a question you'd like to ask me
Follow me 👇 on Medium to read more articles on Voice Technology and Conversation Design, and on Twitter @sprayaga



Shyamala Prayaga is the founder of the Digital Assistant Academy and a self-described evangelist for UX and voice technology.