AI chatbot taken down after it gives ‘harmful advice’ to those with eating disorders

‘Every single thing Tessa suggested were things that led to development of my eating disorder’

Vishwam Sankaran
Thursday 01 June 2023 11:18 BST


An American non-profit has taken its AI chatbot offline after a viral social media post revealed it was giving harmful advice to the very people it was meant to help.

The National Eating Disorders Association (Neda) – which says it is the largest non-profit supporting people with eating disorders – took its chatbot, Tessa, offline just months after it controversially laid off the four staff members who ran its support phone line after they unionised.

The sacked employees alleged that the non-profit intended to replace them with the chatbot, a claim it denied.

When a user reached out to Tessa for advice on recovering from an eating disorder, the chatbot recommended that she count her calories and weigh herself weekly, and suggested where she could get skin callipers to measure body fat.

Since the viral post emerged, several experts have pointed out that counting calories and measuring body fat are antithetical to recovery from an eating disorder.

“Every single thing Tessa suggested were things that led to the development of my eating disorder,” activist Sharon Maxwell posted on Instagram. “This robot causes harm.”

“If I had accessed this chatbot when I was in the throes of my eating disorder, I would NOT have gotten help for my ED,” Ms Maxwell said.

When Alexis Conason, a psychologist specialising in treating eating disorders, tested the bot, she found that its responses could “further promote the eating disorder”.

Nearly 30 million people in the US will experience an eating disorder in their lifetime, according to some estimates.

“Imagine vulnerable people with eating disorders reaching out to a robot for support because that’s all they have available and receiving responses that further promote the eating disorder,” Ms Conason said on Instagram.

Reacting to the incident, Neda said it was taking the chatbot down until further notice and would conduct a complete investigation.

“It came to our attention last night that the current version of Tessa chatbot running the Body Positivity program, may have given information that was harmful and unrelated to the program,” the eating disorder association said in a statement.

“Thank you to the community members who brought this to our attention and shared their experiences,” it said.

The move comes after the non-profit denied using the chatbot as a replacement for the employees it sacked in March. It was, however, reported that the organisation had planned to replace its entire human-operated helpline with the Tessa chatbot from Thursday.

Former staff at the non-profit alleged that the decision to replace the humans behind the helpline with the chatbot was retaliation for their unionisation.

Abbie Harper, a hotline associate and member of the Helpline Associates United union, wrote in a blog post that using the chatbot strips away the personal side of the service, in which staff can speak from their own lived experience.

“That’s why the helpline and the humans who staff it are so important,” the blog post noted.

Commenting on the remarks, Neda’s interim chief Elizabeth Thompson told The Register that claims of the non-profit replacing its helpline service with a chatbot were untrue.

“A chatbot, even a highly intuitive program, cannot replace human interaction. We had business reasons for closing the helpline and had been in the process of that evaluation for three years,” Ms Thompson said.
