Apps that create deepfake nudes should be banned, says online safety group

Internet Matters has urged the Government to respond to a study which says thousands of teenagers have encountered an AI-made deepfake nude image.

Martyn Landi
Tuesday 22 October 2024 00:01 BST
AI-powered ‘nudifying’ apps which can create non-consensual explicit images of people, including children, should be banned, an online safety charity has said (Tim Goode/PA)


AI-powered “nudifying” apps which can create non-consensual explicit images of people, including children, should be banned, an online safety charity has said.

Internet Matters has called on the Government to strengthen the Online Safety Act to ban tools which can create deepfake nudes after a study from the group estimated that as many as half a million children have encountered such images online.

It said its research had found a growing fear among young people over the issue, with 55% of teenagers saying it would be worse to have a deepfake nude of them created and shared than a real image.

Internet Matters said strengthening the new online safety laws, along with fresh legislation to ban nudifying tools, is necessary because current law is not keeping pace. It argued that the AI models used to generate sexual images of children are not themselves illegal in the UK, even though possessing such an image is a criminal offence.

Earlier this month, online safety watchdog the Internet Watch Foundation (IWF) warned that AI-generated child sexual abuse content is now being increasingly found on the open, public web, rather than hidden away on dark web forums.

Internet Matters said it estimates that 99% of deepfake nudes feature women and girls, and warned the content is being used to facilitate child-on-child sexual abuse, adult-perpetrated sexual abuse, and sextortion.

Internet Matters co-chief executive Carolyn Bunting said: “AI has made it possible to produce highly realistic deepfakes of children with the click of a few buttons.

“Nude deepfakes are a profound invasion of bodily autonomy and dignity, and their impact can be life-shattering.

“With nudifying tools largely focused on females, they are having a disproportionate impact on girls.

“Children have told us about the fear they have that this could happen to them without any knowledge and by people they don’t know. They see deepfake image abuse as a potentially greater violation because it is beyond their control.

“Deepfake image abuse can happen to anybody, at any time. Parents should not be left alone to deal with this concerning issue.

“It is time for Government and industry to take action to prevent it by cracking down on the companies that produce and promote these tools that are used to abuse children.”

The safety organisation’s study involved surveying 2,000 parents of children aged three to 17, and 1,000 children aged nine to 17, in the UK.

It found that teenage boys are twice as likely as girls to report an experience with a nude deepfake. However, boys are more likely to be the creators of deepfake nudes, and girls are more likely to be the victims.

The study also indicated support among both children and parents for more education around deepfakes, with 92% of teenagers and 88% of parents saying they believe children should be taught about the risks of the technology in school.

Minister for safeguarding and violence against women and girls Jess Phillips said: “This Government welcomes the work of Internet Matters, which has provided an important insight into how emerging technologies are being misused.

“The misuse of AI technologies to create child sexual abuse material is an increasingly concerning trend.

“Online deepfake abuse disproportionately impacts women and girls online, which is why we will work with Internet Matters and other partners to address this as part of our mission to halve violence against women and girls over the next decade.

“Technology companies, including those developing nudifying apps, have a responsibility to ensure their products cannot be misused to create child sexual abuse content or non-consensual deepfake content.”
