Alexa, Google Home and Siri can be hacked with lasers, researchers find

‘This opens up an entirely new class of vulnerabilities,’ warn scientists

Nicole Perlroth
Tuesday 05 November 2019 13:25 GMT

Since voice-controlled digital assistants were introduced a few years ago, security experts have fretted that systems like Apple’s Siri and Amazon’s Alexa were a privacy threat and could be easily hacked.

But the risk presented by a cleverly pointed light was probably not on anyone’s radar.

Researchers in Japan and at the University of Michigan said Monday they had found a way to take over Google Home, Amazon’s Alexa or Apple’s Siri devices from hundreds of feet away by shining laser pointers, and even torches, at the devices’ microphones.

In one case, they said they opened a garage door by shining a laser beam at a voice assistant that was connected to it. They also climbed 140 feet to the top of a bell tower at the University of Michigan and successfully controlled a Google Home device on the fourth floor of an office building 230 feet away. And by focusing their lasers using a telephoto lens, they said, they were able to hijack a voice assistant more than 350 feet away.

Opening the garage door was easy, the researchers said. With the light commands, they could have hijacked any smart system connected to the voice-controlled assistants.

They said they could have easily switched light switches on and off, made online purchases or opened a front door protected by a smart lock. They even could have remotely unlocked or started a car that was connected to the device.

“This opens up an entirely new class of vulnerabilities,” said Kevin Fu, an associate professor of electrical engineering and computer science at the University of Michigan. “It’s difficult to know how many products are affected because this is so basic.”

The computer science and electrical engineering researchers – Takeshi Sugawara of the University of Electro-Communications in Japan, and Fu, Daniel Genkin, Sara Rampazzi and Benjamin Cyr of the University of Michigan – released their findings in a paper on Monday.

The researchers said they had alerted Tesla, Ford, Amazon, Apple and Google to the light vulnerability. The companies all said they were studying the conclusions in the paper.

The New York Times
