Meta to restrict more content for teenagers on Facebook and Instagram
The social media giant said it was taking steps to give younger users a more age-appropriate experience.
Facebook and Instagram are to start hiding more types of content for teenagers as part of an effort to better protect younger users from harmful material online.
As part of the changes, teenage users will no longer see posts from others discussing their personal struggles with thoughts of self-harm or suicide, even if they follow the user in question.
Meta said it was placing all under-18s into its most restrictive content control settings on Instagram and Facebook, and restricting additional search terms on Instagram.
This setting already applies to new users who join the platforms, but is now being expanded to all teenagers using the apps.
Meta said the settings make it more difficult for people to come across potentially sensitive content or accounts across the apps, including in the Explore sections.
The new measures will be rolled out on the two platforms over the coming months.
On self-harm and suicide content on Instagram, Meta said it was “focused on ways to make it harder to find”, while also offering support to those who post about it.
“While we allow people to share content discussing their own struggles with suicide, self-harm and eating disorders, our policy is not to recommend this content and we have been focused on ways to make it harder to find,” the social media firm said in a blog post.
“Now, when people search for terms related to suicide, self-harm and eating disorders, we’ll start hiding these related results and will direct them to expert resources for help.
“We already hide results for suicide and self-harm search terms that inherently break our rules and we’re extending this protection to include more terms. This update will roll out for everyone over the coming weeks.”
In addition, Meta said it would begin sending notifications to teenagers reminding them to check and update their privacy settings.
In response to the measures, Andy Burrows, adviser to online safety group the Molly Rose Foundation, said Meta’s changes were welcome but did not go far enough to address the problem.
He added: “Our recent research shows teenagers continue to be bombarded with content on Instagram that promotes suicide and self-harm and extensively references suicide ideation and depression.
“While Meta’s policy changes are welcome, the vast majority of harmful content currently available on Instagram isn’t covered by this announcement, and the platform will continue to recommend substantial amounts of dangerous material to children.
“Unfortunately this looks like another piecemeal step when a giant leap is urgently required.”
The charity claimed that much of the harmful content it identified came from meme-style accounts and was not covered by Meta’s announcement.