Siri, Alexa, Google Now Can Be Hacked Using Inaudible Commands


If you are planning to buy a device with a voice assistant, or have already bought one, this new warning could make you think twice about these machines. In a new research study, scientists from China's Zhejiang University claim that Siri, Alexa, Google Now and other voice-activated gadgets can be controlled by feeding them commands at ultrasonic frequencies.

A new way for hackers to trouble you

Though the idea of using high-pitched frequencies to hack voice assistants is not new, the research from Zhejiang offers the most detailed information on the concept to date. It should also ring alarm bells, underlining the growing vulnerability of modern technology.

For the test, the researchers first developed a program that converts a command spoken in a normal voice into high-pitched frequencies that human ears cannot hear. They then tested the commands on 16 voice-controlled systems, including Cortana, Siri, Alexa, Google Now, Samsung S Voice and some in-car interfaces.
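The study itself does not ship code, but the core idea is easy to sketch: shift an ordinary spoken command into the ultrasonic range, for instance by amplitude-modulating it onto a carrier above 20 kHz. The sketch below illustrates that idea in Python; the 25 kHz carrier, 96 kHz output rate and file names are illustrative assumptions, not details taken from the study.

```python
import numpy as np
from scipy.io import wavfile

CARRIER_HZ = 25_000    # carrier above ~20 kHz, outside normal human hearing
TARGET_RATE = 96_000   # output sample rate high enough to represent the carrier

# Load a spoken command, e.g. a recording of "open the back door" (illustrative file name).
rate, voice = wavfile.read("command.wav")
if voice.ndim > 1:                    # mix a stereo recording down to mono
    voice = voice.mean(axis=1)
voice = voice.astype(np.float64)
peak = np.max(np.abs(voice))
if peak > 0:
    voice /= peak                     # normalise to the range [-1, 1]

# Resample the command up to the target rate with simple linear interpolation.
duration = len(voice) / rate
t = np.linspace(0.0, duration, int(duration * TARGET_RATE), endpoint=False)
src_t = np.linspace(0.0, duration, len(voice), endpoint=False)
voice_up = np.interp(t, src_t, voice)

# Classic amplitude modulation: the command rides on an inaudible carrier.
carrier = np.sin(2 * np.pi * CARRIER_HZ * t)
modulated = (1.0 + 0.8 * voice_up) * carrier   # 0.8 = modulation depth

# Scale to 16-bit PCM and save; played through a capable speaker, the result is silent to people.
out = modulated / np.max(np.abs(modulated))
wavfile.write("ultrasonic_command.wav", TARGET_RATE, (out * 32767).astype(np.int16))
```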

The researchers termed their method “DolphinAttack,” and with it they were able to get voice assistants to open malicious websites and even unlock doors when a smart lock was connected. They successfully tested commands such as “Open the back door,” “Call 123-456-7890,” and “Open Dolphinattack.com,” and they were also able to alter the navigation system of an Audi Q3.

What’s even scarier is that “the inaudible voice commands can be correctly interpreted by the SR (speech recognition) systems on all the tested hardware,” the researchers said. This means the commands can work even if the attacker never touches the device and the owner has the usual security defenses in place.

Further, the research shows that the problem lies in both the hardware and the software: the microphone hardware lets the device pick up ultrasonic sound, while the software fails to differentiate between a human voice and a computer-generated one.

Are Siri, Alexa, Google Now, etc. really vulnerable?

Such a finding means that hackers may have a new way to trouble you by accessing your phones, cars, tablets, etc.

Ame Elliott, design director at the nonprofit Simply Secure, told Fast Company, “I think Silicon Valley has blind spots in not thinking about how a product may be misused. It’s not as robust a part of the product planning as it should be.”

However, users should not worry too much, as the technique has limitations. To carry out such an attack, the attacker has to be very close to the device, the smartphone or voice assistant must be left activated, and the owner must not be near the gadget while the hack is taking place. All of these conditions coming together is very unlikely, but not impossible.

In most cases, the gadgets picked up the inaudible commands only when they came from a few inches away. In some instances, however, the researchers carried out the attack from several feet away, for example with the Apple Watch, The Telegraph notes.

According to the researchers, a permanent fix for the vulnerability is for device makers to program their AI assistants to ignore frequencies above 20 kHz, or to pay no heed to any frequencies that humans cannot hear.
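That fix amounts to filtering out ultrasonic content before the audio ever reaches the speech-recognition engine. Below is a minimal sketch of such a filter, assuming Python with SciPy; the function name and the Butterworth design are illustrative choices, with only the 20 kHz cutoff taken from the article.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def strip_ultrasonic(audio: np.ndarray, sample_rate: int, cutoff_hz: float = 20_000.0) -> np.ndarray:
    """Low-pass filter a recording so inaudible carriers never reach the recogniser."""
    nyquist = sample_rate / 2.0
    if cutoff_hz >= nyquist:
        # Nothing above the cutoff can be represented at this sample rate anyway.
        return audio
    # Eighth-order Butterworth low-pass, applied forwards and backwards for zero phase shift.
    sos = butter(8, cutoff_hz / nyquist, btype="low", output="sos")
    return sosfiltfilt(sos, audio)

# Example (illustrative): clean a 96 kHz capture before handing it to the recogniser.
# filtered = strip_ultrasonic(captured_audio, sample_rate=96_000)
```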
