Research Finds Alexa, Siri and Google Assistant Susceptible to Inaudible Commands

Researchers in China and the United States have discovered that Apple’s Siri, Amazon’s Alexa, and Google Assistant can be controlled by hidden commands undetectable to the human ear, The New York Times reports. The researchers were able to secretly activate all three AI assistants, making them dial phone numbers or open websites.

The researchers say criminals could exploit the technology to unlock doors, wire money, or make purchases online, simply with music playing over the radio. Back in 2016, students at the University of California, Berkeley showed that they could hide commands in white noise played over loudspeakers and through YouTube videos to get smart devices to turn on airplane mode or open a website.

Last year, researchers at Princeton University and China’s Zhejiang University demonstrated that voice-recognition systems could be activated by using frequencies inaudible to the human ear. The attack first muted the phone so the owner wouldn’t hear the system’s responses either.

The technique, which the Chinese researchers called DolphinAttack, can instruct smart devices to visit malicious websites, initiate phone calls, take a picture or send text messages. While DolphinAttack has its limitations — the transmitter must be close to the receiving device — experts warned that more powerful ultrasonic systems were possible.
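The core idea behind DolphinAttack is to amplitude-modulate a recorded voice command onto an ultrasonic carrier above the range of human hearing; nonlinearity in a device's microphone then demodulates the command back into the audible band, where the assistant's speech recognizer picks it up. Below is a minimal, hypothetical sketch of that modulation step in Python, using NumPy. The carrier frequency, sample rate, and a synthetic 1 kHz tone standing in for a real voice recording are all illustrative assumptions, not parameters from the paper.

```python
import numpy as np

def modulate_ultrasonic(baseband, fs=192_000, carrier_hz=30_000):
    """Amplitude-modulate a baseband 'voice' signal onto an ultrasonic carrier.

    A microphone's nonlinear response can demodulate the envelope back into
    the audible band. The 30 kHz carrier and 192 kHz sample rate here are
    illustrative; a real attack would tune these to the target hardware.
    """
    t = np.arange(len(baseband)) / fs
    carrier = np.cos(2 * np.pi * carrier_hz * t)
    # Standard AM: shift the baseband so the envelope is non-negative,
    # then multiply by the carrier.
    peak = np.max(np.abs(baseband)) + 1e-12
    envelope = 0.5 * (1.0 + baseband / peak)
    return envelope * carrier

# Demo: a 10 ms, 1 kHz tone standing in for a recorded voice command.
fs = 192_000
t = np.arange(int(0.01 * fs)) / fs
baseband = np.sin(2 * np.pi * 1_000 * t)
signal = modulate_ultrasonic(baseband, fs=fs)
```

After modulation, all of the signal's energy sits around 30 kHz (the carrier plus sidebands at ±1 kHz), so a human standing next to the transmitter hears nothing, while the device's microphone recovers the original tone.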

“Companies have to ensure user-friendliness of their devices, because that’s their major selling point,” said Tavish Vaidya, a researcher at Georgetown. “We want to demonstrate that it’s possible,” he said, “and then hope that other people will say, ‘O.K. this is possible, now let’s try and fix it.’”

Do you think a simple software patch issued by Apple, Amazon or Google will be enough to remove the security risks from their respective assistants?
