TikTok users warned viral suicide video could be inserted into their feeds and hidden in unrelated posts
The video was spread on 4chan, Facebook, and Instagram before making its way to TikTok
A video of a man taking his own life is spreading across TikTok, despite attempts to take it down.
The video apparently spread on Facebook and Instagram before being shared on TikTok.
Users report that the distressing footage is being spliced into other, seemingly unrelated videos, so that it can appear without warning in their feeds.
Media reports say the video was streamed live on Facebook on 31 August by a Mississippi man.
It was also shared on 4chan, an anonymous messaging board notorious for its extreme content.
The video has now made its way onto TikTok, where users report it has appeared, without warning, in their feeds.
However, many others have posted content warning people to avoid the video.
These tell users that if they see the opening frame of the video - a white man with a beard sitting at a desk - they should immediately swipe away or close the app.
TikTok said it was using automated systems to detect and remove the video.
“Our systems have been automatically detecting and flagging these clips for violating our policies against content that displays, praises, glorifies, or promotes suicide,” TikTok said in a statement.
“We are banning accounts that repeatedly try to upload clips, and we appreciate our community members who’ve reported content and warned others against watching, engaging, or sharing such videos on any platform out of respect for the person and their family.”
This is not the first time a video of a suicide has spread on social media, nor the first time technology giants have struggled to remove graphic content.
In February 2019, Instagram pledged to remove graphic images of self-harm from its search function. Between April and June of that year, it took down 834,000 pieces of such content from its site.
In March 2019, when videos of the Christchurch terrorist attack were being spread on social media, Facebook said that it could not remove them quickly from its platform because of their visual similarity to video games.
“If thousands of videos from livestreamed video games are flagged by our systems, our reviewers could miss the important real-world videos, where we could alert first responders to get help on the ground,” Guy Rosen, Facebook VP of integrity, said at the time.
When life is difficult, Samaritans are here – day or night, 365 days a year. You can call them for free on 116 123, email them at jo@samaritans.org, or visit www.samaritans.org to find your nearest branch.