TikTok: Huge number of videos removed for election and Covid misinformation - but it’s not the app’s biggest problem
‘Minor safety’ has the app’s highest proactive removal rate, according to the report
TikTok has released its latest transparency report - an insight into how the short-form video app deals with removing and filtering content - revealing the scale of misinformation that users attempt to spread on the app.
The company removed nearly 350,000 videos related to misinformation regarding the 2020 US election, and over 50,000 relating to false claims about the coronavirus pandemic.
“In the second half of 2020, 347,225 videos were removed in the US for election misinformation, disinformation, or manipulated media. We worked with fact checkers at PolitiFact, Lead Stories, and SciVerify to assess the accuracy of content and limit distribution of unsubstantiated content. As a result, 441,028 videos were not eligible for recommendation into anyone's For You feed”, TikTok said.
“We further removed 1,750,000 accounts that were used for automation during the timeframe of the US elections. While it's not known if any of the accounts were used specifically to amplify election related content, it was important to remove this set of accounts to protect the platform at this critical time.”
As well as removing videos, the company assembled election guides with information gathered from the National Association of Secretaries of State, the US Election Assistance Commission, The Associated Press, and other “trusted organizations”, the company says. This guide was visited 17,995,580 times.
TikTok has similar resources for its COVID-19 information page, which the company says was viewed 2,625,049,193 times globally in the second half of last year. The company also removed 51,505 videos for promoting COVID-19 misinformation.
“Of those videos, 86 per cent were removed before they were reported to us, 87 per cent were removed within 24 hours of being uploaded to TikTok, and 71 per cent had zero views”, the report states.
These areas have been particularly pressing recently, for obvious reasons. TikTok had been under pressure from the Trump administration to sell to Walmart or Oracle over claimed national security concerns relating to China, concerns that were unsubstantiated at best and baffling at worst.
The deal has reportedly been shelved indefinitely by the Biden administration, although the administration is evaluating risks to US data in a review that includes TikTok.
However, misinformation is not the type of content that is most often removed.
“Minor safety” has TikTok’s highest proactive removal rate, at 97.1 per cent, followed by illegal activities and regulated goods, suicide and self-harm content, and violent and graphic content - all of which are above a 90 per cent proactive removal rate.
TikTok says that 36 per cent of content removed violated its minor safety policy, an increase from 22.3 per cent in the first half of the year.
Some communities on the video app are linked to content that is harmful to children. A recent investigation into the video chat website Omegle found that its popularity has spiked in recent years due to the virality of clips posted from it to TikTok.
In January 2021, TikTok changed its default privacy settings for young people’s accounts. Users aged between 13 and 15 now need to manually approve followers, and the app blocks other people from downloading videos created by younger users.