Suicide tips hidden in YouTube and YouTube Kids videos, paediatrician warns

Clips depicting school shootings and instructions on self-harm allegedly found spliced in videos on children’s app

Lindsey Bever
Monday 25 February 2019 18:05 GMT
YouTube has struggled with how to keep the platform free from material that is damaging to children (Getty iStock)

A US paediatrician has warned parents about videos showing children how to kill themselves found on YouTube and YouTube Kids.

Free Hess learned about the chilling videos over the summer when another mum spotted one on YouTube Kids.

She said that minutes into the clip from a children's video game, a man appeared on the screen – giving instructions on how to commit suicide.

“I was shocked,” Ms Hess said, noting that since then, the scene has been spliced into several more videos from the popular Nintendo game “Splatoon” on YouTube and YouTube Kids.

Ms Hess, from Ocala, Florida, has been blogging about the altered videos and working to get them taken down amid an outcry from parents and child health experts, who say such visuals can be damaging to children.

One on YouTube shows a man pop into the frame. “Remember, kids,” he begins, holding what appears to be an imaginary blade to the inside of his arm. “Sideways for attention. Longways for results.”

“I think it's extremely dangerous for our kids,” Ms Hess said about the clips Sunday in a phone interview with the Washington Post.

“I think our kids are facing a whole new world with social media and internet access. It's changing the way they're growing, and it's changing the way they're developing. I think videos like this put them at risk.”

A recent YouTube video appears to include a spliced-in scene showing internet personality Filthy Frank.

It is unclear why he was edited into these clips, but his fans have been known to put him in memes and other videos. There is a similar video on his channel filmed in front of a green screen, but the origins and context of the clip in question are not clear.

Andrea Faville, a spokeswoman for YouTube, said in a written statement that the company works to ensure that it is “not used to encourage dangerous behaviour and we have strict policies that prohibit videos which promote self-harm.”

“We rely on both user flagging and smart detection technology to flag this content for our reviewers,” Ms Faville added.

“Every quarter we remove millions of videos and channels that violate our policies and we remove the majority of these videos before they have any views. We are always working to improve our systems and to remove violative content more quickly, which is why we report our progress in a quarterly report and give users a dashboard showing the status of videos they've flagged to us.”

The videos come amid mounting questions about how YouTube, the world's largest video-sharing platform, monitors and removes problematic content.

YouTube has long struggled with how to keep the platform free from such material – removing hateful and violent videos, banning dangerous pranks and cracking down on child sexual exploitation.

Last month, YouTube announced that it was rebuilding its recommendation algorithm to prevent it from prompting videos that include conspiracy theories and other bogus information, though the videos would remain on the site.

Ms Hess said she has been writing about the distressing video clips on her blog, PediMom, to raise awareness and to get them removed from the platform.

Earlier this month, she found a second one – this time on YouTube. She recorded it, wrote about it and reported the content to the video-sharing platform, she said. The video was taken down.

Another version was reposted on 12 February, receiving more than 1,000 views before it, too, was removed from the site.

Ms Hess said the doctored “Splatoon” videos are not the only ones pushing dark and potentially dangerous content on social media platforms, particularly on YouTube Kids.

In a blog post last week, Ms Hess alerted other parents to numerous concerning videos she said she found on the app – a “Minecraft” video depicting a school shooting, a cartoon centred on human trafficking, one about a child who died by suicide by stabbing and another about a child who attempted suicide by hanging.

Nadine Kaslow, a past president of the American Psychological Association, told the Washington Post that it is a “tragic” situation in which “trolls are targeting kids and encouraging kids to kill themselves.”

Ms Kaslow, who teaches at Emory University's school of medicine, said that some children may ignore the grim video content but that others, particularly those who are more vulnerable, may be drawn to it.

She said such videos can cause children to have nightmares, trigger bad memories about people close to them who have killed themselves or even encourage them to try it, though some of them may be too young to understand the consequences.

Ms Kaslow said parents should monitor what their children do online and tech companies should ensure such content is removed. Still, she said, it's not enough.

“I don't think you can just take them down,” she said about the videos. “For children who have been exposed, they've been exposed. There needs to be messaging – this is why it's not OK.”


Though parents should talk to their children about the videos, Ms Kaslow said, YouTube Kids also should address the issue, explaining to children what the videos were and why children should never harm themselves.

She added that there should be “serious consequences” for those who had a hand in the videos, noting that it was “very worrisome” that they were targeting children.

According to the Centers for Disease Control and Prevention, risk factors associated with suicide may include mental disorders such as clinical depression, previous suicide attempts, barriers to accessing mental health treatment, physical illness and feelings of hopelessness or isolation.

Washington Post
