Amazon Alexa tells 10-year-old child to give herself an electric shock for a ‘challenge’

Amazon says that it has now removed the challenge from its database

Adam Smith
Wednesday 29 December 2021 18:50 GMT
Voice assistants such as Google Assistant, Siri and Alexa get their information from common search engines, but do not have the ability to effectively check the information (Getty)


Amazon’s Alexa voice assistant recommended to a 10-year-old that she give herself an electric shock as part of a “challenge”.

Kristin Livdahl posted on Twitter that the voice assistant recommended the action after her daughter asked for a challenge.

“Here’s something I found on the web”, Alexa replied. “The challenge is simple: plug in a phone charger about halfway into a wall outlet, then touch a penny to the exposed prongs.”

Ms Livdahl said that she and her daughter were doing some “physical challenges” and that her daughter wanted another one.

“I was right there and yelled, ‘No, Alexa, no!’ like it was a dog. My daughter says she is too smart to do something like that anyway”, she tweeted.

Amazon has since removed the challenge from its database.

“Customer trust is at the centre of everything we do and Alexa is designed to provide accurate, relevant, and helpful information to customers,” an Amazon spokesperson said in a statement. “As soon as we became aware of this error, we took swift action to fix it.”

Voice assistants such as Google Assistant, Siri and Alexa get their information from common search engines, but do not have the ability to effectively check the information – and as such can provide false or offensive results.

In December 2020, Alexa was found to be repeating conspiratorial and racist remarks. Asked if Islam is evil, one result returned by Alexa was: “Here’s something I found on the web. According to [a website], Islam is an evil religion.”

In 2018, Apple’s Siri voice assistant thought that Donald Trump was a penis, due to someone vandalising the then US president’s Wikipedia page and Siri pulling the information from there.
