Researchers from the Tokyo-based University of Electro-Communications and the University of Michigan have discovered that Siri, Alexa, and Google Assistant are all vulnerable to attacks that use lasers to inject inaudible — and sometimes invisible — commands into the devices, surreptitiously causing them to unlock doors, visit websites, and locate, unlock, and start vehicles, among other things.
“It’s possible to make microphones respond to light as if it were sound,” says cybersecurity researcher Takeshi Sugawara. “This means that anything that acts on sound commands will act on light commands.”
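In essence, the attack encodes the audio of a spoken command in the laser beam's brightness, and the microphone recovers it as if it were sound pressure. Here is a minimal sketch of that amplitude modulation in Python; the function name and the `bias` and `depth` parameters are illustrative assumptions, not details from the researchers' paper:

```python
import numpy as np

def am_laser_drive(audio, bias=0.5, depth=0.4):
    """Amplitude-modulate a laser's intensity with an audio waveform.

    Light intensity cannot go negative, so the audio signal rides on
    a DC offset (`bias`) with a chosen modulation depth (`depth`).
    A microphone that responds to light recovers `audio` from the
    resulting intensity variation. Both parameters are illustrative.
    """
    audio = np.asarray(audio, dtype=float)
    peak = float(np.max(np.abs(audio))) or 1.0
    drive = bias + depth * audio / peak  # normalized intensity
    return np.clip(drive, 0.0, 1.0)

# Example: a 440 Hz tone standing in for a voice command, 16 kHz sampling
fs = 16_000
t = np.arange(fs) / fs
tone = np.sin(2 * np.pi * 440 * t)
drive = am_laser_drive(tone)
print(drive.min() >= 0.0, drive.max() <= 1.0)  # intensity stays physical
```

In a real attack the drive signal would feed a laser driver circuit; the point of the sketch is only that the command waveform maps directly onto light intensity.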
The list of devices included Google Home, various Amazon Echo models, the Apple HomePod, and Facebook’s Portal speaker, which runs Alexa, explains Andy Greenberg of Wired. They also tested an iPhone XR, a Samsung Galaxy S9, and a Google Pixel 2.
The team found that all of the devices were vulnerable to the attack, to varying degrees. The researchers were able to hijack the phones, tablets, and speakers from a distance — and through windows. They hijacked a Google Home speaker from 110 meters away, for example.
Some of the devices were less vulnerable than others. Some Android smartphones, the iPhone, and iPad all require additional authentication or a “wake word” from the user before carrying out certain actions. Would-be hijackers would need to re-create a person saying a wake command like “Hey Siri” or “OK Google” to wake up an assistant before they could carry out an attack.
The researchers are now working with Google, Apple, Amazon, and others to mitigate the issue. They suggest that smart speaker makers address the vulnerability by adding a light shield around the microphone, or by requiring voice commands to be picked up by two microphones on opposite sides of the device.
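The two-microphone suggestion works because airborne sound reaches both sides of a device at comparable levels, while a focused laser spot illuminates only one microphone port. A hypothetical plausibility check along those lines might look like this (the function, threshold, and parameter names are illustrative assumptions, not the researchers' design):

```python
def plausible_acoustic(rms_front: float, rms_back: float,
                       max_ratio: float = 4.0) -> bool:
    """Accept a command only if both microphones saw comparable energy.

    A laser drives one microphone far harder than the other, so a
    large front/back imbalance suggests light injection rather than
    airborne sound. `max_ratio` is an illustrative tuning value.
    """
    quieter = min(rms_front, rms_back)
    louder = max(rms_front, rms_back)
    if quieter <= 0.0:
        return False  # one mic registered nothing: not airborne sound
    return louder / quieter <= max_ratio

# Airborne speech reaches both sides; a laser lights up only one.
print(plausible_acoustic(0.50, 0.35))   # True: comparable levels
print(plausible_acoustic(0.50, 0.001))  # False: 500x imbalance
```

A production defense would need to tolerate normal acoustic shadowing (a hand cupped over one side, a wall nearby), which is presumably why the researchers pair this idea with physical light shielding.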
A Google spokesperson told Wired that the company is “closely reviewing” the research paper. “Protecting our users is paramount, and we’re always looking at ways to improve the security of our devices,” the spokesperson added.