Simple Design Flaw Makes Siri and Alexa Vulnerable to Hacks

According to a group of researchers from Zhejiang University in China, a critical design flaw in voice assistants from Apple and Amazon makes them extremely vulnerable to hacks (via FastCompany). Hackers can take control of Siri and Alexa by issuing commands in frequencies humans can’t hear: the “DolphinAttack” method translates ordinary audio commands into ultrasonic frequencies.


The source explains that this relatively simple translation process lets attackers take control of gadgets with just a few words uttered in frequencies none of us can hear. According to the researchers, the commands could range from a simple wake phrase like “Hey Siri” to forcing an iPhone to open a malicious website, or asking an Amazon Echo to “open the backdoor”. Attackers could even push an Audi Q3 to change its navigation settings and add a new destination.
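For readers curious how such a translation might work, here is a minimal sketch in Python. It amplitude-modulates a recorded voice command onto an ultrasonic carrier, which is the general approach behind DolphinAttack; nonlinearity in microphone hardware then demodulates the signal back into the audible range. The file names, sample rate, and carrier frequency below are illustrative assumptions, not the researchers’ actual code.

```python
import numpy as np
from scipy.io import wavfile

# Illustrative parameters (assumptions, not from the paper's setup):
SAMPLE_RATE = 192_000   # high sample rate needed to represent ultrasound
CARRIER_HZ = 30_000     # carrier above the ~20 kHz limit of human hearing

def modulate_command(command: np.ndarray, rate: int = SAMPLE_RATE) -> np.ndarray:
    """Amplitude-modulate a baseband voice command onto an ultrasonic carrier."""
    t = np.arange(len(command)) / rate
    carrier = np.sin(2 * np.pi * CARRIER_HZ * t)
    # Classic AM: the carrier's amplitude tracks the voice waveform.
    modulated = (1.0 + command) * carrier
    return modulated / np.max(np.abs(modulated))  # normalize to [-1, 1]

# Hypothetical input: a "Hey Siri" clip already resampled to SAMPLE_RATE.
rate, voice = wavfile.read("hey_siri_192k.wav")
voice = voice.astype(np.float64)
voice /= np.max(np.abs(voice))  # normalize baseband to [-1, 1]

ultrasonic = modulate_command(voice, rate)
wavfile.write("hey_siri_ultrasonic.wav", rate,
              (ultrasonic * 32767).astype(np.int16))
```

Played through a speaker capable of ultrasonic output, the resulting file would be silent to a human listener but, per the researchers, still intelligible to the assistant’s microphone.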

In other words, Silicon Valley has designed human-friendly UI with a huge security oversight. While we might not hear the bad guys talking, our computers clearly can. “From a UX point of view, it feels like a betrayal,” says Ame Elliott, design director at the nonprofit SimplySecure. “The premise of how you interact with the device is ‘tell it what to do,’ so the silent, surreptitious command is shocking.”

To hack each voice assistant, the researchers used a smartphone with about $3 of additional hardware, including a tiny speaker and an amplifier. In theory, their methods, which are now public, can be duplicated by anyone with a bit of technical know-how and a few bucks in their pocket.

Their research will be presented at the ACM Conference on Computer and Communications Security in Dallas, Texas, from October 30 to November 3. In the meantime, check out the video demonstration below and tell us what you think.

YouTube video
