TikTok will now tell users exactly why it has removed their videos amid confusion over disappearing posts

The company is rolling out the change globally

Adam Smith
Friday 23 October 2020 13:13 BST

TikTok has updated its community guidelines so that people who have had their videos removed from the platform will now be informed why.

“For the past few months, we've been experimenting with a new notification system to bring creators more clarity around content removals”, TikTok said in a blog post.

“Our goals are to enhance the transparency and education around our Community Guidelines to reduce misunderstandings about content on our platform”.

The viral video app company claimed that its enforcement actions have reduced the rate of repeat violations, and that visits to read the company’s Community Guidelines have nearly tripled.

It also says that it has seen a 14 per cent reduction in appeals from users to reinstate videos.

The company says it is now rolling out the change globally.

Alongside this update, TikTok is also detecting content that may relate to self-harm or suicide.

When it does, the company will provide links to specialist organisations, such as Befrienders Worldwide, and a list of frequently asked questions to try to direct users to beneficial resources.

In its transparency report from July, TikTok said that it had removed over 49 million videos in the six months prior.

This accounts for less than one per cent of all the videos uploaded to the site, suggesting that more than 4.9 billion videos were uploaded over that period.

Over a quarter of the videos were taken down for adult nudity and sexual acts, while others were removed for “depicting harmful, dangerous, or illegal behaviour by minors”, such as the use of alcohol or more serious narcotics.

Recently, the company has had to crack down on QAnon-related content, banning accounts that promote the conspiracy theory and making it harder to find that content across search and hashtags.

"Content and accounts that promote QAnon violate our disinformation policy and we remove them from our platform," a TikTok spokesperson said in a statement.

The QAnon conspiracy theory states that President Donald Trump is fighting a Satan-worshipping cabal of paedophiles who are plotting to enslave the world.

Polling suggests that one in four Britons believe in conspiracy theories related to the movement.

TikTok is also taking a stronger stance against antisemitic and Islamophobic content, removing videos and hashtags that spread misinformation and hurtful stereotypes.
