ChatGPT built with help of underpaid, exploited Kenyan employees, report alleges
Kenyan workers were tasked with labelling content from ‘darkest recesses of the internet’, TIME reports
OpenAI’s chatbot ChatGPT was reportedly built using vital contributions from outsourced, underpaid Kenyan labourers.
The chatbot was built with help from a Kenya-based data labeling team who earned less than $2 per hour, according to an investigation by TIME.
Outsourced Kenyan workers were also exposed to graphic sexual content as part of efforts to rid the platform of violence and hate speech.
The labourers were sent snippets of text for labelling from the “darkest recesses of the internet” depicting graphic content like “child sexual abuse, bestiality, murder, suicide, torture, self harm, and incest”, TIME reported.
Workers reportedly read hundreds of these kinds of entries each day for wages that ranged from $1 to $2 an hour, or a $170 monthly salary.
The Kenyan team was managed by Sama, a San Francisco-based firm, which said its workers could take advantage of both individual and group therapy sessions with “professionally-trained and licensed mental health therapists”.
One worker who was tasked with labelling text told TIME he suffered from recurring visions after reading a graphic description of a man having sex with a dog. “That was torture,” he said.
Sama reportedly ended all its contracted work for OpenAI in February 2022, much earlier than planned.
In December last year, ChatGPT gained prominence for its “mind-blowing” ability to respond to a range of queries with human-like text output, with researchers across the world praising the AI’s general purpose language model.
Several users also pointed out the chatbot’s seemingly effective safeguards preventing it from producing racist or violent output.
Its launch led to widespread speculation that it could revolutionise industries and might even replace tools like Google’s search engine.
But several institutions and scholars had also raised concerns about the widespread use of the AI chatbot disrupting academia.
The New York City education department said it was worried about the negative impacts of ChatGPT on student learning, amid “concerns regarding the safety and accuracy of content.”
“There will be scary moments as we move towards AGI-level systems, and significant disruptions, but the upsides can be so amazing that it’s well worth overcoming the great challenges to get there,” OpenAI chief Sam Altman wrote in a Twitter thread.
“There are going to be significant problems with the use of OpenAI tech over time; we will do our best but will not successfully anticipate every issue,” he said.
Some companies, including Google, had previously warned that releasing such an AI technology for widespread use may pose risks due to inbuilt biases and misinformation.
But Mr Altman maintained that the AI technology would be necessary for humanity “to fully understand the universe.”
OpenAI and Sama did not immediately respond to several approaches for comment by The Independent.