Alexa can be hacked–by chirping birds?

Scientists at Ruhr-Universität Bochum in Germany have discovered a way to hide inaudible commands in audio files – commands that, while imperceptible to our ears, can take control of voice assistants. According to the researchers behind the technique, the flaw lies in the very way the underlying AI is designed.

It’s part of a growing area of research known as “adversarial attacks”: inputs designed to confuse deep neural networks–usually visually, as Co.Design has covered in the past–potentially leaving the technology and infrastructure in our world that depend on AI to function vulnerable to attacks by bad-faith actors.
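
As a rough illustration of the general idea (this is not from the Bochum paper), here is a toy sketch of the classic “fast gradient sign” style of adversarial attack against a made-up linear classifier; the model, the numbers, and the numpy dependency are all assumptions for illustration.

```python
# Toy illustration of an adversarial attack (FGSM-style), the visual analogue
# of the audio attack described here: nudge each input dimension slightly in
# the direction that most increases the model's error. The tiny linear
# "model" and all values are made up for illustration.
import numpy as np

rng = np.random.default_rng(0)
w = rng.standard_normal(10)          # weights of a toy linear classifier
x = rng.standard_normal(10)          # a "clean" input the model classifies
y = 1.0                              # its true label (+1 or -1)

def margin(x):
    return y * (w @ x)               # positive = correct and confident

# The gradient of the loss -margin(x) w.r.t. x is -y*w;
# FGSM steps a small amount along its sign.
epsilon = 0.1                        # perturbation budget (kept small)
x_adv = x - epsilon * np.sign(y * w)

print(f"clean margin:       {margin(x):+.3f}")
print(f"adversarial margin: {margin(x_adv):+.3f}")  # pushed toward misclassification
```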

In this case, the systems being “attacked” by researchers at the Ruhr-Universität Bochum are personal assistants like Alexa, Siri, and Cortana. According to Professor Thorsten Holz from the Horst Görtz Institute for IT Security, their method, called “psychoacoustic hiding,” shows how hackers could manipulate any type of audio wave–from songs and speech to even bird chirping–to include words that only the machine can hear, allowing them to issue commands without nearby people noticing. The attack sounds just like a bird’s call to our ears, but a voice assistant would “hear” something very different.
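
To make the masking principle behind “psychoacoustic hiding” concrete, here is a simplified sketch: a perturbation is only allowed energy in time-frequency bins where the louder carrier audio would drown it out to human ears. This is not the researchers’ implementation (their attack optimizes the hidden signal against the speech recognizer under a proper psychoacoustic hearing model); the 20 dB margin, the scipy STFT settings, and the synthetic chirp standing in for birdsong are all illustrative assumptions.

```python
# Minimal sketch of the masking idea behind "psychoacoustic hiding":
# a perturbation is attenuated, bin by bin, so it stays below a crude
# masking threshold derived from a louder carrier signal's spectrogram.
import numpy as np
from scipy.signal import stft, istft

FS = 16_000          # sample rate (Hz)
MARGIN_DB = 20.0     # keep the perturbation this far below the carrier per bin

def masked_embed(carrier: np.ndarray, perturbation: np.ndarray) -> np.ndarray:
    """Mix a perturbation into the carrier, scaled so each time-frequency
    bin stays under a crude carrier-derived masking threshold."""
    _, _, C = stft(carrier, fs=FS, nperseg=512)
    _, _, P = stft(perturbation, fs=FS, nperseg=512)
    # Crude threshold: carrier magnitude minus a fixed margin (in dB).
    threshold = np.abs(C) * 10 ** (-MARGIN_DB / 20)
    # Attenuate only the perturbation bins that would exceed the threshold.
    scale = np.minimum(1.0, threshold / (np.abs(P) + 1e-12))
    _, masked = istft(P * scale, fs=FS, nperseg=512)
    n = min(len(carrier), len(masked))
    return carrier[:n] + masked[:n]

# Toy demo: a rising chirp as the "birdsong" carrier, with noise standing in
# for the adversarial signal that a real attack would carefully optimize.
tt = np.arange(FS) / FS
carrier = 0.5 * np.sin(2 * np.pi * (2000 + 1500 * tt) * tt)
perturbation = 0.1 * np.random.default_rng(0).standard_normal(FS)
attack_audio = masked_embed(carrier, perturbation)
```

In the actual attack, the hidden signal is not random noise: it is optimized so that the speech recognizer transcribes the attacker’s target command while the audio stays below the human hearing threshold.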

Attacks could be played over an app, for instance, or in a TV commercial or radio program, to hack thousands of people at once–and potentially make purchases on their accounts or steal their private information. “[In] a worst-case scenario, an attacker may be able to take over the entire smart home system, including security cameras or alarm systems,” Holz says.

An Amazon spokesperson told Co.Design that the company takes security issues seriously and is “reviewing the findings by the researchers.” Another way to look at this problem? Whenever possible–and unfortunately, it’s not always possible–don’t use unsecured smart speakers for sensitive information until they deliver on the promise of a secure and safe user experience.

Sources/Further Reading:

Fast Company: Alexa can be hacked–by chirping birds

Lea Schönherr, Katharina Kohls, Steffen Zeiler, Thorsten Holz, and Dorothea Kolossa, “Adversarial Attacks Against ASR Systems via Psychoacoustic Hiding,” Ruhr-Universität Bochum (technical paper)