Helpline unable to remove tens of thousands of revenge porn images
Exclusive: ‘The behaviour of sharing intimate images without consent is illegal, but the content itself is not illegal,’ says revenge porn helpline manager
Tens of thousands of revenge porn images and footage cannot be removed from the internet due to a “loophole” in the law, a new study has found.
Research by the Revenge Porn Helpline, shared exclusively with The Independent, identified 30,000 web pages containing intimate image abuse material which the helpline cannot take down.
Campaigners warn that tech platforms can refuse to remove such content because they are not legally obliged to do so, and have called for ministers to urgently overhaul the law in this area.
Researchers found around 10 per cent of reported instances of intimate image abuse material remain online, even when the perpetrator has been convicted of sharing the content.
Sophie Mortimer, a manager at the helpline, said: “Sharing intimate images without consent is illegal, but the content itself is not illegal, so some platforms simply don’t cooperate with our request to remove imagery and footage.”
She explained that the majority of platforms - including the main social media platforms and prominent adult sites - are very cooperative about taking material down, but others ignore the helpline’s requests.
Some of these platforms’ business models are centred around sharing and re-sharing intimate image-based abuse, Ms Mortimer said.
“On the periphery of the internet which exists to share this content, they do not cooperate,” she added. “Sometimes, they are not hosted in the UK. They are hosted in countries where there is little regulation and enforcement such as Russia, parts of Asia and South America. Telegram ignores our requests to take down imagery.”
She noted internet service providers would be able to block revenge porn if ministers were to change the law - adding that campaigners have been raising concerns about the “legal loophole” for years.
Ms Mortimer added: “Women are disproportionately affected and shamed and humiliated by intimate image abuse. It is the violation of someone’s trust if that image was shared without their consent.
“It results in broken relationships - both intimate partner relationships and family relationships and friendships - because it is very common for people to victim-blame. We still hear the response to someone making this kind of disclosure of ‘why did you send a nude in the first place?’”
Ms Mortimer warned revenge porn is a far more prevalent problem than many realise.
“Intimate image-based abuse can result in deteriorating mental health and also in people losing jobs - often we see images have been circulated around people’s workplaces to cause maximum harm,” she added. “Sometimes people are so overwhelmed with shame they can’t go to work or even leave their homes.”
Researchers found reports to the Revenge Porn Helpline surged by 106 per cent over 2023, with women having around 28 times more images shared than men.
The study found women were the targets in almost three quarters of incidents involving threats to share intimate material, and were the victims in almost eight in ten of all voyeurism cases.
Researchers found that in 95 per cent of cases where the helpline needed to report content, the victims were women.
A spokesperson for the Department for Science, Innovation and Technology said: “Once implemented, the Online Safety Act will require sites to block access to websites hosting illegal non-consensual intimate images if ordered to by a court via Ofcom’s powers.
“We are also cracking down on abusers who share intimate images of someone without their consent, by giving police and prosecutors the powers they need to bring these cowards to justice.”
But Ms Mortimer warned that although Ofcom has powers to force internet service providers to block sites sharing illegal content, it is a “lengthy and convoluted process” which profoundly lets down victims.
She added: “And currently, non-consensually shared images are not illegal so wouldn’t qualify.”
Telegram has been contacted for comment.