Smart speakers could accidentally record users up to 19 times per day, study reveals

TV shows like Gilmore Girls and The Office 'responsible for the majority of activations'

Anthony Cuthbertson
Monday 24 February 2020 13:18 GMT


Smart speakers like the Amazon Echo, Apple HomePod and Google Home are being inadvertently triggered by popular TV shows to record private conversations, a study has revealed.

Researchers at Northeastern University and Imperial College London made the discovery after playing 125 hours of Netflix content to see if the voice assistants were activated by dialogue that sounded like wake words.

Activations occurred up to 19 times per day, with HomePod and Microsoft’s Cortana assistant most susceptible to accidental recordings. The experiment also found that Gilmore Girls and The Office were “responsible for the majority of activations” due to the large amount of dialogue in the shows.

“Anyone who has used voice assistants knows that they accidentally wake up and record when the ‘wake word’ isn’t spoken – for example, ‘seriously’ sounds like the wake word ‘Siri’ and often causes Apple’s Siri-enabled devices to start listening,” the researchers wrote.

“There are many other anecdotal reports of everyday words in normal conversation being mistaken for wake words... Our team has been conducting research to go beyond anecdotes through the use of repeatable, controlled experiments that shed light on what causes voice assistants to mistakenly wake up and record.”

The experiment involved placing an Amazon Echo Dot, Apple HomePod, Google Home Mini and Microsoft smart speaker in a box and playing the audio from TV shows to see if it prompted their wake words: “Alexa”; “Hey Siri”; “OK Google”; and “Cortana”.

Several patterns emerged among the non-wake words that triggered the devices, such as words that rhymed with or sounded similar to the wake words. For example, the Amazon Echo Dot was activated when it mistook “kevin’s car” for “Alexa”.

The researchers also referred to privacy concerns that these devices have raised in recent years, claiming that their study suggests “these aren’t just hypothetical concerns from paranoid users”.

Google Home Mini, Apple HomePod, Amazon Echo Dot and Microsoft’s Harman Kardon were used in the experiment (Northeastern University)

In December, a prominent privacy expert warned that Amazon Echo speakers should not be placed in people’s bedrooms due to the risk of inadvertent spying.

Tech expert Hannah Fry revealed that she did not use the smart speaker in upstairs rooms of her house after a data request revealed that it had eavesdropped on private conversations.

“I think there are some spaces in your home, like the bedroom and bathroom, which should remain completely private,” she said at the time.

“There are people who are very senior in the tech world who will not have so much as a smartphone in their bedroom. If a company is offering you a device with an internet-connected microphone at a low price, you have to think about that very carefully.”

Earlier this month, a former Amazon executive revealed that he disabled his Alexa-powered smart speaker when he “didn’t want certain conversations to be heard by humans”.

Ex-AWS manager Robert Fredrick made the revelation after Amazon admitted to employees listening to customer voice recordings made by its Alexa voice assistant.

Amazon said recordings were used to train its “speech recognition and natural language understanding systems” and improve its AI assistant.

“We take the security and privacy of our customers’ personal information seriously,” the firm said.
