Girl, 15, calls for criminal penalties after classmate made deepfake nudes of her and posted on social media

Teenager Elliston Berry is calling on lawmakers to require that tech companies remove AI-generated pornography within two days of a report

Katie Hawkinson
Tuesday 25 June 2024 17:50 BST
Elliston Berry (left) and her mother Anna McAdams (right) say federal protections are needed for victims of deepfake porn. Berry was just 14 when a classmate distributed AI-generated nude images of her last year (CNN)


Last October, 14-year-old Elliston Berry woke up to a nightmare.

The teen’s phone was flooded with calls and texts telling her that someone had shared fake nude images of her on Snapchat and other social media platforms.

“I was told it went around the whole school,” Berry, from Texas, told Fox News. “And it was just so scary going through classes and attending school, because just the fear of everyone seeing these images, it created so much anxiety.”

The photos were AI-generated, what are known as deepfakes. These fabricated images and videos have become frighteningly prevalent in recent years. Deepfakes are made to look hyper-realistic and are often used to impersonate major public figures or to create fake pornography, but they can also cause significant harm to ordinary people.

Berry, now 15, is calling on lawmakers to enact criminal penalties for perpetrators in order to protect future victims of deepfake images.

The teen told the outlet that after discovering what had happened, she immediately went to her parents. Her mother, Anna McAdams, told Fox News she knew the images were fake. McAdams then reached out to Snapchat several times over an eight-month period to have the photos removed.


While the deepfakes of Berry were eventually taken down, McAdams told CNN, the classmate who distributed them is facing few repercussions.

“This kid who is not getting any kind of real consequence other than a little bit of probation, and then when he’s 18, his record will be expunged, and he’ll go on with life, and no one will ever really know what happened,” McAdams told CNN.

This week, Republican Senator Ted Cruz, Democratic Senator Amy Klobuchar and several colleagues co-sponsored a bill that would require social media companies to take down deepfake pornography within two days of receiving a report.

The Take It Down Act would also make it a felony to distribute these images, Cruz told Fox News. Perpetrators who target adults could face up to two years in prison, while those who target children could face three years.

Cruz said that what happened to Berry “is a sick and twisted pattern that is getting more and more common.”

“[The bill] puts a legal obligation on the big tech companies to take it down, to remove the images when the victim or the victim's family asks for it,” Cruz said. “Elliston's Mom went to Snapchat over and over and over again, and Snapchat just said, ‘Go jump in a lake.’ They just ignored them for eight months.”

A spokesperson for Snap Inc, Snapchat’s parent company, said the platform does not allow pornography and has policies that prohibit deepfakes and bullying.

The mother and daughter say legislation is essential to protecting future victims, and could have meant more serious consequences for the classmate who shared the deepfakes.

“If [this law] had been in place at that point, those pictures would have been taken down within 48 hours, and he could be looking at three years in jail...so he would get a punishment for what he actually did,” McAdams told CNN.

While the photos are now off Snapchat, Berry says she remains terrified they will resurface unless the Take It Down Act is passed.

“It’s still so scary as these images are off Snapchat, but that does not mean that they are not on students’ phones, and every day I’ve had to live with the fear of these photos getting brought up resurfacing,” Berry said. “By this bill getting passed, I will no longer have to live in fear knowing that whoever does bring these images up will be punished.”

This article was updated with comment from Snap Inc.
