TikTok says coordinated attack behind suicide clip uploads
TikTok says a video of a man apparently taking his own life that circulated on its platform appears to have been spread deliberately by a group working together
TikTok says a video of a man apparently taking his own life that circulated on its platform was spread deliberately by a group of users working together.
The company found evidence of a “coordinated attack” when it investigated why the video was suddenly appearing on the popular short-video sharing app, a TikTok executive told a British parliamentary committee Tuesday.
TikTok scrambled earlier this month to remove clips of the man shooting himself with a gun, raising concerns about the platform's ability to stop harmful content from reaching its users, many of whom are teens.
Theo Bertram, TikTok's European director of public policy, said there was a huge spike in the number of clips uploaded to TikTok about a week after the original video was livestreamed on Facebook.
“There's evidence of a coordinated attack,” Bertram said. “Through our investigations, we learned that groups operating on the dark web made plans to raid social media platforms including TikTok, in order to spread the video across the internet. What we saw was a group of users who were repeatedly attempting to upload the video to our platform.” The dark web is a part of the internet accessible only through anonymity-providing software.
The users were “splicing it, editing it and cutting it in different ways” and then making new accounts to help spread it, he said.
TikTok users usually look through their own feed or use hashtags to find videos. But these users were clicking on account profiles, apparently anticipating that those accounts would post the suicide clip, which is an unusual way to find videos, Bertram said. He gave few other details.
The company wrote Monday to nine other tech platforms proposing that they warn each other about violent and graphic content on their own services.
Bertram's comments came as TikTok said in its latest transparency report that it took down 104.5 million videos for violating its guidelines or terms of service during the first six months of the year. That's less than 1% of the total number of videos uploaded for that period.
TikTok's Chinese owner, ByteDance, is fighting pressure in some of its major markets. In the U.S., TikTok faces a ban later this month from smartphone app stores, followed by a broader ban in November unless ByteDance can persuade U.S. officials it can resolve national security concerns.