TikTok bans under 16s from messaging each other as part of new safety measures
'Despite its potential for good, we understand the potential for misuse,' the viral video app says
TikTok users under the age of 16 will soon no longer be able to send or receive direct messages through the hugely popular video-sharing app.
From 30 April, new online safety measures introduced by the Chinese-owned app will stop children from using the Direct Messaging feature to contact other users.
TikTok’s head of safety Cormac Keenan explained that the ban is aimed at “going one step further” than its existing restrictions, which already prevent users from receiving unsolicited messages from people who are not friends with them on the app.
“As part of our commitment to improve safety on TikTok, we are introducing new restrictions on who can use our Direct Messaging feature,” he said.
“Direct Messaging is an amazing tool that enables people to make new friends and connections no matter where they are in the world. But despite its potential for good, we understand the potential for misuse.”
Andy Burrows, head of online child safety at the NSPCC, praised the firm’s “proactive” step and called on other social media companies to do the same.
“This is a bold move by TikTok as we know that groomers use direct messaging to cast the net widely and contact large numbers of children,” he said.
“Offenders are taking advantage of the current climate to target children spending more time online, but this shows proactive steps can be taken to make sites safer and frustrate groomers from being able to exploit unsafe design choices.”
Since launching in 2016, TikTok has been downloaded more than 1.5 billion times, according to figures from app analytics firm Sensor Tower.
Last year, it was one of the most-downloaded apps in the world, proving particularly popular with younger demographics.
Its massive popularity has brought with it increased scrutiny on how it is protecting young people from privacy breaches and online abuse.
In February, TikTok announced a variety of safety measures that allow parents to control what their children see on the platform.
The family safety mode links children’s accounts to their parents’, meaning controls on screen time and the type of content that appears on their feed can be enforced.