
Facebook could be letting inappropriate pictures of children through moderation, report alleges

Adam Smith
Friday 01 April 2022 20:16 BST
Facebook TikTok Rivalry (Copyright 2021 The Associated Press. All rights reserved.)

Meta, the parent company of Facebook and Instagram, has reportedly told moderators to “err on the side of an adult” when moderating pictures or videos of young people.

Antigone Davis, head of safety for Meta, told the New York Times that the policy stems from privacy concerns for people who post sexual images of adults. “The sexual abuse of children online is abhorrent,” Ms Davis said.

However, millions of photos and videos are moderated as they are uploaded to Facebook, and the company made 27 million reports of suspected child abuse in 2021. Yet experts still believe that moderators are likely missing some minors.

A training document created for moderators at Accenture, a consulting firm used by Facebook, reportedly instructs moderators to “bump up” adolescents to young adults when in doubt. The Independent has reached out to Accenture for comment.

Content moderators who worked for Meta reportedly said that they encountered sexual images affected by this policy every day, and that they would face negative performance reviews if they made too many erroneous reports.

“They were letting so many things slide that we eventually just didn’t bring things up anymore,” said one former moderator. “They would have some crazy, extravagant excuse like, ‘That blurry portion could be pubic hairs, so we have to err on the side of it being a young adult.’”

Facebook and other technology companies use the ‘Tanner stages’ to determine the stages of puberty; the scale was developed by paediatrician Dr James Tanner in the 1960s, but it was not designed to determine age. Ms Davis said the scale was “just one factor in estimating age”; other factors could include muscle development or the child’s face.

Companies like Meta must report “apparent” child sexual abuse material, but the law does not define the word “apparent”. Ms Davis said it was unclear whether the law would protect Meta if it erroneously reported an image.

Apple, Snap, which owns Snapchat, and TikTok told the Times that they take the opposite approach to Meta’s, reporting any sexual image that is in doubt.

“We report more material to the National Center for Missing and Exploited Children than literally all other tech platforms combined, because we are the most aggressive. What’s needed is for lawmakers to establish a clear and consistent standard for all platforms to follow”, a Meta spokesperson said.
