
TikTok users warned viral suicide video could be inserted into their feeds and hidden in unrelated posts

The video was spread on 4chan, Facebook, and Instagram before making its way to TikTok

Adam Smith
Tuesday 08 September 2020 11:17 BST
The lawsuit centres on TikTok's 'Green Screen Mode', where users record multiple videos synchronised with an audio track.

A video of a man taking his own life is spreading across TikTok, despite attempts to take it down.

The video apparently spread on Facebook and Instagram before being shared on TikTok.

Users report that the distressing footage is being placed inside other, seemingly unrelated videos, so that it may appear without warning in users’ feeds.

Media reports say the video was streamed live on Facebook on 31 August by a Mississippi man.

It also spread on 4chan, an anonymous messaging board that has gained notoriety for its extreme content.

The video has now made its way onto TikTok, where users report it has appeared, without warning, in their feeds.

However, many others have posted content warning people to avoid the video.

These tell users that if they see the opening frame of the video - a white man with a beard sitting at a desk - they should immediately swipe away or close the app.

TikTok said it is using automated systems to detect and block uploads of the video.

“Our systems have been automatically detecting and flagging these clips for violating our policies against content that displays, praises, glorifies, or promotes suicide,” TikTok said in a statement.

“We are banning accounts that repeatedly try to upload clips, and we appreciate our community members who’ve reported content and warned others against watching, engaging, or sharing such videos on any platform out of respect for the person and their family.”

This is not the first time a video of a suicide has spread on social media, nor the first time technology giants have struggled to remove graphic content.

In February 2019, Instagram pledged to remove graphic images of self-harm in its search function.

Between April and June of that year, it took down 834,000 pieces of such content from its site.

In March 2019, when videos of the Christchurch terrorist attack were being spread on social media, Facebook said that it could not remove them quickly from its platform because of their visual similarity to video games.

“If thousands of videos from livestreamed video games are flagged by our systems, our reviewers could miss the important real-world videos, where we could alert first responders to get help on the ground,” Guy Rosen, Facebook VP of integrity, said at the time.

When life is difficult, Samaritans are here – day or night, 365 days a year. You can call them for free on 116 123, email them at jo@samaritans.org, or visit www.samaritans.org to find your nearest branch.
