
South Korean AI chatbot pulled from Facebook after hate speech towards minorities

Lee Luda had the persona of a 20-year-old university student

Namita Singh
Thursday 14 January 2021 12:51 GMT
Lee Luda is a South Korean AI chatbot that was pulled down after it engaged in hate speech against sexual and racial minorities (Scatter Lab)


A popular South Korean artificial intelligence-driven chatbot was taken down from Facebook this week after it was accused of spewing hate speech against minorities.

Lee Luda, a chatbot designed by Korean startup Scatter Lab, had the persona of a 20-year-old female university student. It had attracted more than 750,000 users since its launch in December last year.

The company, while suspending the chatbot, apologised for its discriminatory and hateful remarks. 

“We sincerely apologise for the occurrence of discriminatory remarks against certain minority groups in the process. We do not agree with Luda's discriminatory comments, and such comments do not reflect the company's thinking,” the Seoul-based startup said in its statement.

The service conversed with users by drawing on old chat records obtained from the company’s mobile application Science of Love.

Some users took to social media to share the racist slurs used by the AI. The chatbot was seen calling Black people “heukhyeong,” a racist slur in South Korea, and responding with “disgusting” when asked about lesbians.

"Luda is a childlike AI who has just started talking with people. There is still a lot to learn. Luda will learn to judge what is an appropriate and better answer,” the company said in the statement.

Scatter Lab is also facing questions over violations of privacy laws.

It is, however, not the first time that an AI bot has been embroiled in a controversy related to discrimination and bigotry. In 2016, Microsoft was forced to shut down its chatbot Tay within 16 hours of its launch, after the bot was manipulated into repeating Islamophobic and white supremacist slurs.

In 2018, Amazon’s AI recruitment tool was also suspended after the company found that it made recommendations that were biased against women.
