Alexa Can Be Hacked–By Chirping Birds

Researchers were able to attack a common speech recognition system using voice commands hidden in other audio recordings

Fast Company

--

Photo: Rahul Chakraborty/Unsplash

By Jesus Diaz

Scientists at Ruhr-Universität Bochum in Germany have discovered a way to hide inaudible commands in audio files–commands that, while imperceptible to our ears, can take control of voice assistants. According to the researchers behind the technique, the flaw lies in the very way AI is designed.

It’s part of a growing area of research known as “adversarial attacks”: inputs crafted to confuse deep neural networks–usually visual ones, as Co.Design has covered in the past–leaving the AI-dependent technology and infrastructure around us potentially vulnerable to bad-faith actors.

In this case, the systems being “attacked” by researchers at the Ruhr-Universität Bochum are personal assistants like Alexa, Siri, and Cortana. According to Professor Thorsten Holz from the Horst Görtz Institute for IT Security, their method, called “psychoacoustic hiding,” shows how hackers could manipulate any type of audio wave–from songs and speech to even bird chirping–to include words that only the machine can hear, allowing them to give commands without nearby people…
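To give a rough sense of the principle at work, the sketch below illustrates the general idea of hiding one signal under another using a crude masking threshold: the added audio is kept quieter than the carrier in every time-frequency bin, so a listener is unlikely to notice it even though it changes what a machine receives. This is not the Bochum team’s actual attack; the function names, the 12 dB margin, and the STFT settings are illustrative assumptions only.

```python
# Illustrative sketch of "hide a signal under a louder carrier" (an assumption,
# not the researchers' published method). The payload's spectrogram is capped
# below the carrier's spectrogram, then the two are mixed.
import numpy as np

def stft(x, win=512, hop=256):
    """Short-time Fourier transform with a Hann window."""
    w = np.hanning(win)
    frames = [np.fft.rfft(w * x[i:i + win])
              for i in range(0, len(x) - win, hop)]
    return np.array(frames)

def istft(frames, win=512, hop=256, length=None):
    """Overlap-add inverse of the STFT above."""
    w = np.hanning(win)
    out = np.zeros(hop * len(frames) + win)
    norm = np.zeros_like(out)
    for i, f in enumerate(frames):
        out[i * hop:i * hop + win] += w * np.fft.irfft(f, n=win)
        norm[i * hop:i * hop + win] += w ** 2
    out /= np.maximum(norm, 1e-8)
    return out[:length] if length is not None else out

def hide_below_carrier(carrier, payload, margin_db=12.0):
    """Attenuate `payload` so each time-frequency bin sits `margin_db`
    below the carrier's magnitude in that bin, then mix the signals."""
    C, P = stft(carrier), stft(payload)
    n = min(len(C), len(P))
    C, P = C[:n], P[:n]
    ceiling = np.abs(C) * 10 ** (-margin_db / 20)      # crude masking threshold
    scale = np.minimum(1.0, ceiling / (np.abs(P) + 1e-12))
    hidden = istft(P * scale, length=len(carrier))
    return carrier + hidden                             # perturbed audio

# Example: bury a quiet 1 kHz tone inside one second of noise standing in
# for a recording such as birdsong (16 kHz sample rate assumed).
rng = np.random.default_rng(0)
carrier = rng.normal(0, 0.1, 16000)
payload = 0.05 * np.sin(2 * np.pi * 1000 * np.arange(16000) / 16000)
adversarial = hide_below_carrier(carrier, payload)
```

The real attack is far more involved–it optimizes the perturbation against the speech recognizer itself rather than simply scaling a payload–but the masking intuition is the same: whatever is added must stay below what the surrounding audio already drowns out for human ears.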
