Children filming themselves in graphic sexual videos for 'likes' online in growing trend
New trend revealed as a record number of webpages are reported to the Internet Watch Foundation in 2019
A third of child sex abuse images are originally posted online by children themselves, it has emerged – with warnings of a growing trend where minors share graphic footage for “likes”.
The Internet Watch Foundation (IWF) said the past year has seen a “significant change” in the amount of self-generated images, which are mostly taken by girls aged between 11 and 13.
A record 260,400 webpages were reported in 2019, of which 132,700 showed children being sexually abused.
The IWF said that because each page may contain thousands of files, the reports equate to millions of images.
Its chief executive, Susie Hargreaves OBE, said a third of the images had originally been taken by children themselves amid an increasing proportion of “self-generated” content.
“Often the girls are looking at the camera, on sites where they’re reading instructions and doing things for likes,” she told The Independent.
“You can see them looking into a screen and say ‘I won’t do that unless I get 1,000 likes’.
“They are incredibly vulnerable and in some ways they are actively participating in this abuse ... it’s a terribly coercive and manipulative relationship.”
Ms Hargreaves said that depending on the platform, children may be receiving instructions from one person, or multiple users.
Some of the children may believe they are streaming to an online boyfriend or friend, she said, although others are “manipulated” into broadcasting themselves before the images are shared more widely through paedophile networks.
Ms Hargreaves said some of the children are motivated by perceived online popularity, “likes” or rankings on streaming platforms.
Much of the footage appears to be taken in children’s bedrooms, with one video interrupted by an adult knocking on the door to tell the victim dinner was ready.
“We’ve seen a really big rise in it in the last year,” Ms Hargreaves warned.
“My message to anyone is if a child is in their bedroom and on their own, and they have a camera and internet access, they need to have appropriate parental supervision.
“Just because they’re in their bedroom it doesn’t mean they’re not being groomed.”
Girls are seen in around 80 per cent of the images flagged to the IWF, which removed 132,000 webpages last year.
“The problem with the numbers being so huge is we forget every single image is a real child,” Ms Hargreaves said.
“We will not stop until we take down every single image, we will go after it again and again because we owe it to that victim.”
She warned that while the dark web was still being used by paedophiles, material reported to the IWF was on the open internet, adding: “Child sexual abuse is a horrific topic for people to talk about, but as a society we have got to take on board a heavy dose of reality and face up to what’s right in front of us.”
The government’s upcoming Online Harms Bill will include proposals to tackle the “epidemic” of online child sex abuse images.
The Queen’s Speech, which set out Boris Johnson’s legislative agenda, said a new duty of care would be created and enforced for technology companies.
“Ahead of this legislation, the government will publish interim codes of practice on tackling the use of the internet by terrorists and those engaged in child sexual abuse and exploitation,” a government briefing said last month.
“This will ensure companies take action now to tackle content that threatens our national security and the physical safety of children.”
It comes after police and the National Crime Agency warned that they do not have the capacity to investigate or penalise everyone who views child sex abuse images.
Chief Constable Simon Bailey, the National Police Chiefs’ Council lead for child protection, told The Independent officers want to focus on suspects in direct contact with children but are being “overwhelmed” by lower-level image offences.
“We have always been clear that we can’t arrest our way out of this problem,” he said in May.
“We are not able to target the high-risk and high-harm offenders because we are overwhelmed with volume referrals, therefore something has to change.
“That change needs to be a cross-system approach including educating children at home and school about the risks online, ensuring tech companies deliver on their responsibilities to prevent the uploading and sharing of images, and applying conditional cautions to low-risk offenders whereby they have to confront their offending behaviour.”