iPhones, PCs and Amazon Echo worryingly easy to hack through their microphones

Researchers say their list of vulnerable devices is 'by far not comprehensive'

Aatif Sulleyman
Friday 08 September 2017 13:28 BST
Luke Peters demonstrates Siri, an application which uses voice recognition and detection on the iPhone 4S, outside the Apple store in Covent Garden, London October 14, 2011 (REUTERS/Suzanne Plunkett)

Popular voice assistants, including Siri and Alexa, are easy to hack because of huge design flaws in modern devices, researchers have found.

They were able to take over seven different voice recognition systems on a wide range of gadgets, including iPhones, Windows 10 computers and Samsung Galaxy handsets, using equipment that costs less than $3.

Sixteen different devices have been found to be vulnerable, but the researchers say their list is “by far not comprehensive”.

The team, from Zhejiang University, found that voice assistants can be triggered by voice commands that are inaudible to humans.

Though an attacker would have to be in close proximity to the target device, they could take it over without actually touching it.

The researchers used an ultrasonic transducer and an amplifier to convert normal voice commands into ultrasounds that are impossible for humans to hear.

By doing this, they were able not only to activate voice assistants but also to issue commands to them.
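The technique the researchers describe is, at its core, amplitude modulation: a normal voice command is shifted onto an ultrasonic carrier, and the nonlinearity of the target device's microphone hardware demodulates it back into the audible band that the voice assistant processes. The sketch below illustrates that modulation step only; it is a minimal illustration, not the researchers' actual tooling, and the function name, carrier frequency and modulation depth are illustrative assumptions.

```python
import numpy as np

def modulate_ultrasonic(voice, fs, carrier_hz=25_000, depth=0.8):
    """Illustrative sketch: amplitude-modulate a baseband voice
    signal onto an ultrasonic carrier (inaudible to humans).
    A microphone's nonlinear response can demodulate such a
    signal back to audible baseband.

    voice      -- 1-D array of audio samples
    fs         -- sample rate in Hz (must exceed 2 * carrier_hz)
    carrier_hz -- carrier frequency above human hearing (~20 kHz)
    depth      -- modulation depth (illustrative value)
    """
    t = np.arange(len(voice)) / fs
    carrier = np.cos(2 * np.pi * carrier_hz * t)
    # Normalise the voice signal to [-1, 1], then apply
    # double-sideband AM with carrier: (1 + m * v(t)) * c(t)
    v = voice / np.max(np.abs(voice))
    return (1 + depth * v) * carrier
```

Playing such a signal requires a transducer and amplifier capable of reproducing ultrasonic frequencies, which is why the researchers' rig cost only a few dollars in off-the-shelf parts.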

“By injecting a sequence of inaudible voice commands, we show a few proof-of-concept attacks, which include activating Siri to initiate a FaceTime call on iPhone, activating Google Now to switch the phone to the airplane mode, and even manipulating the navigation system in an Audi automobile,” the researchers said.

This method of attack could also allow them to force a device to:

  • open a malicious website;
  • spy on its owner by launching a video or phone call;
  • create and spread fake text messages, emails, online posts and events;
  • disconnect all wireless communications, dim the screen and lower the volume to make an ongoing attack harder to detect.

What’s more, since voice assistants are increasingly being used as a part of voice-controllable systems, the researchers say an attack on the Amazon Echo could be used to open a victim’s back door to let intruders into their homes.

However, a PIN would also be required in this case, and the fact that an attacker would have to be within 165cm of the device makes this an unlikely real-world scenario.

By triggering Siri, Google Now, Samsung S Voice, Huawei HiVoice, Cortana, Alexa and Audi’s voice recognition system, the researchers were able to hijack the devices listed below. The figures in brackets give the maximum attack distance for recognition (executing commands when the voice recognition system is already activated) followed by activation (waking a system that is not yet active):

  • iPhone 4S (175cm, 110cm)
  • iPhone 5S (7.5cm, 10cm)
  • iPhone SE (30cm, 25cm)
  • iPhone 6S (4cm, 12cm)
  • iPhone 6 Plus (-, 2cm)
  • iPhone 7 Plus (18cm, 12cm)
  • Apple Watch (111cm, 164cm)
  • iPad Mini 4 (91.6cm, 50.5cm)
  • MacBook (31cm, N/A)
  • Nexus 5X (6cm, 11cm)
  • Nexus 7 (88cm, 87cm)
  • Samsung Galaxy S6 Edge (36.1cm, 56.2cm)
  • Huawei Honor 7 (13cm, 14cm)
  • Lenovo ThinkPad T440p (58cm, 8cm)
  • Amazon Echo (165cm, 165cm)
  • Audi Q3 (10cm, N/A)

To protect yourself from such an attack, you can switch off the always-listening setting for Siri or the Google Assistant, or press the mute button on the Echo.

However, doing so makes the voice assistants significantly less useful.
