Meta scrambles to fix Instagram algorithm connecting ‘vast paedophile network’

‘Suggested for you’ feature reportedly linked users to child sexual abuse material

Anthony Cuthbertson
Thursday 08 June 2023 11:13 BST
Instagram’s logo displayed on a tablet screen on 18 October, 2021 (Getty Images)


Meta has launched an investigation into reports that Instagram is promoting child sexual abuse material through its algorithm.

Facebook’s parent company set up a taskforce to investigate the claims after the Stanford Internet Observatory (SIO) said it had found “large-scale communities” sharing paedophilic content on the platform.

The SIO said it discovered the child sexual abuse material (CSAM) following a tip from the Wall Street Journal, whose report on Wednesday detailed how Instagram’s recommendation algorithm helped connect a “vast pedophile network” of sellers and buyers of illegal material.

Instagram’s ‘suggested for you’ feature also linked users to off-platform content sites, according to the report, with the SIO describing Instagram as “currently the most important platform” for these networks.

“Instagram has emerged as the primary platform for such networks, providing features that facilitate connections between buyers and sellers,” Stanford’s Cyber Policy Center wrote in a blog post.

“Instagram’s popularity and user-friendly interface make it a preferred option for these activities.”

Instagram users were able to find child abuse content through explicit hashtags such as #pedowhore, which have since been blocked by the platform.

“Child exploitation is a horrific crime,” a Meta spokesperson said.

“We’re continuously investigating ways to actively defend against this behaviour, and we set up an internal task force to investigate these claims and immediately address them.”

Meta said it had taken down 27 paedophile networks on Instagram over the past two years, and in January alone removed 490,000 accounts that violated its child safety policies.

The SIO also identified other social media platforms hosting this type of content, though to a much lesser extent.

The SIO called for an industry-wide initiative to limit production, discovery and distribution of CSAM, while also urging companies to devote more resources to proactively identifying and stopping abuse.

“Given the multi-platform nature of the problem, addressing it will require better information sharing about production networks, countermeasures, and methods for identifying buyers,” the organisation said.

“SIO hopes that this research aids industry and non-profits in their efforts to remove child sexual abuse material from the internet.”
