YouTube trolls target children by uploading animated shows with spliced-in clips promoting self-harm
YouTube said it gave the person responsible for editing and re-uploading the disturbing videos a 'strike' for violating company policy
Internet trolls are targeting children on the YouTube Kids app by re-uploading popular shows with spliced-in clips encouraging self-harm.
The clip in question features a man jokingly instructing viewers how to cut themselves.
In the past year, YouTube has had to remove at least two videos after parents discovered the clip while watching cartoons with their children.
The controversial video clip is of former YouTuber Filthy Frank performing his self-harm skit, BuzzFeed reported. These clips are often inserted into Splatoon-themed cartoons and re-uploaded onto the video-streaming app made for children. (Splatoon is a third-person shooter video game developed by Nintendo.)
“Remember, kids, sideways for attention, longways for results. End it,” the man in the clip says, while mimicking cutting his arms.
The clip was originally filmed in front of a green screen, which an internet troll reportedly replaced with animated footage to make it appear that Filthy Frank was living in a cartoon fantasy world.

It is no coincidence that trolls splice self-harm clips specifically into Splatoon-style cartoons: most Splatoon cartoons uploaded to YouTube earn millions of views.

The user who edited in the self-harm clips and uploaded them to YouTube Kids remains unidentified.
However, YouTube told BuzzFeed News that the user responsible for uploading the distasteful videos has been given a strike. In other words, despite the videos targeting impressionable children and encouraging them to cut themselves, the video-sharing company will continue to allow the user to upload content to its platform.

If the user continues to violate the company's policies, YouTube said, it will terminate their channel.
The first time Dr Free Hess, a paediatrician and mother living in Gainesville, Florida, saw one of the edited videos was in July 2018. Her friend, also a mother, showed her the video, featuring Filthy Frank's disturbing self-harm clip, which her child had watched on the YouTube Kids app.
“This clip is compiled in with a bunch of ridiculously stupid and outrageously offensive stuff on a green screen,” Ms Hess told BuzzFeed News.
The video had over 600,000 views before YouTube took it down.
Ms Hess wrote about the shocking video on her blog, warning parents about the inappropriate and dangerous content their children could be exposed to when watching videos without supervision.
“Her son had watched this particular cartoon on various channels and YouTube Kids, and how often do parents watch the whole way through?” she added.
Ms Hess and several other parents repeatedly reported the video to YouTube, but it was not removed from the site until she directly contacted someone at the company.
Filthy Frank, whose real name is George Miller, has about 6.2m subscribers on his YouTube channel. His videos frequently feature cameo appearances from other popular, and sometimes controversial, internet personalities, including Felix Kjellberg, the Swedish YouTuber commonly known as PewDiePie, who has drawn criticism for making rape jokes, racial slurs and antisemitic remarks.
On the "About" page of his YouTube channel, Filthy Frank says the character is a representation of the worst possible traits, values and attitudes a person could have.
“Filthy Frank is the embodiment of everything a person should not be. He is anti-PC, anti-social, and anti-couth,” he wrote on the page.
He added: "There is no denying that the show is terribly offensive, but this terrible offensiveness is a deliberate and unapologetic parody of the whole social media machine and a reflection of the human microcosm that that social media is. OR MAYBE I'M JUST F****** RETARDED."
In one of his videos, Filthy Frank is depicted as a One Direction fan killing himself. He hasn't posted a video in over a year.
Filthy Frank did not respond to The Independent’s request for comment.
The re-uploaded videos preying on children with horrifying messages encouraging self-harm are still out there. Ms Hess said she encountered another re-uploaded video featuring Filthy Frank promoting cutting and recorded it with her phone. While it was still online, she said, she saw comments demanding the video be taken down dating back several months.
It was taken down only a few days ago, BuzzFeed News reported.
Ms Hess said she wants YouTube to be held "personally responsible" for proactively deleting these videos before children are exposed to them.
“I would like them to recognise the dangers associated with this for our children [and] to be taking parents’ concerns seriously and have a better process for removing these things when reported,” she said.
“It’s not happening fast enough, and it’s not taken seriously enough.”
YouTube did not respond to The Independent's request for comment, but a spokesperson told BuzzFeed News that the San Bruno-based company has policies barring content that encourages self-harm.
The spokesperson said YouTube, which depends on users and automation to flag inappropriate content, “works hard to ensure YouTube is not used to encourage dangerous behaviour and we have strict policies that prohibit videos which promote self-harm”.
It also said that it takes down millions of videos and channels violating its policies and removes “the majority of these videos before they have any views”.