Surge in paedophiles using virtual reality tech in child abuse image crime in UK

Police recorded 30,925 offences involving obscene images of children in 2021/22, the highest number ever logged by forces in England and Wales

Margaret Davis
Wednesday 22 February 2023 04:17 GMT

Paedophiles are starting to use virtual reality headsets to view child abuse images, crime records suggest.

Children’s charity the NSPCC obtained data from police forces in England and Wales including details of which social media sites or types of technology were mentioned in reported crimes.

Police recorded 30,925 offences involving obscene images of children in 2021/22, the highest number ever logged by forces in England and Wales.

Among these, a social media or gaming site was recorded in 9,888 cases – including Snapchat 4,293 times, Facebook 1,361, Instagram 1,363 and WhatsApp 547.

Virtual reality was recorded eight times by police forces in crime reports, the first time this technology has been specifically mentioned, the NSPCC said.

The NSPCC is asking for amendments to the Online Safety Bill to create a child safety advocate to represent the interests of children and families.

It also wants changes to the law so that senior managers of social media sites can be held criminally liable if children are exposed to preventable abuse.

Sir Peter Wanless, chief executive of the NSPCC, said: “These new figures are incredibly alarming but reflect just the tip of the iceberg of what children are experiencing online.

“We hear from young people who feel powerless and let down as online sexual abuse risks becoming normalised for a generation of children.

“By creating a child safety advocate that stands up for children and families the Government can ensure the Online Safety Bill systemically prevents abuse.

“It would be inexcusable if in five years’ time we are still playing catch-up to pervasive abuse that has been allowed to proliferate on social media.”

A Government spokesperson said: “Protecting children is at the heart of the Online Safety Bill and we have included tough, world-leading measures to achieve that aim while ensuring the interests of children and families are represented through the Children’s Commissioner.

“Virtual reality platforms are in scope and will be forced to keep children safe from exploitation and remove vile child abuse content. If companies fail to tackle this material effectively, they will face huge fines and could face criminal sanctions against their senior managers.”

A spokesman for Meta, which owns Facebook, Instagram and WhatsApp, said it reports child sexual exploitation to the international child protection organisation the National Center for Missing & Exploited Children (NCMEC).

He added: “This horrific content is banned on our apps, and we report instances of child sexual exploitation to NCMEC.

“We lead the industry in the development and use of technology to prevent and remove this content, and we work with the police, child safety experts and industry partners to tackle this societal issue.

“Our work in this area is never done, and we’ll continue to do everything we can to keep this content off our apps.”
