
YouTube CEO won't commit to banning QAnon content

QAnon adherents have already hijacked social causes with their ideology.

Graig Graziosi
Monday 12 October 2020 22:34 BST

Susan Wojcicki, the CEO of YouTube, would not commit to banning QAnon content on the video platform.

Ms Wojcicki made the comments during a guest appearance on a CNN podcast.  

She said that YouTube was "looking very closely at QAnon" accounts and said the platform has "implemented a large number of different policies that have helped maintain that in a responsible way."  

At the moment, YouTube only removes QAnon content when it is in violation of other policies prohibiting hateful or harassing content. The video platform claims it tries to sideline questionable content that does not cross the line into flagrant violations of its rules.

YouTube has been criticised in the past for its algorithm's tendency to push viewers towards radical right-wing and alt-right videos in its suggested video section.  

The company said that it has since changed its algorithm so that it no longer pushes users towards increasingly extremist content. It claimed that there has been an 80 per cent drop in views from recommendations to prominent QAnon video channels.  

The company also said that it has already removed "tens of thousands of Q-related videos, and terminated hundreds of Q-related channels for violating our hate and harassment policies since 2018."  

Facebook, Instagram and Twitter have taken action to curb QAnon content in recent weeks. Twitter blocked QAnon-related content from its Trending section, and Facebook and Instagram deleted the most prominent Q-related accounts on the platforms.

Ms Wojcicki said if the video platform was going to ban Q-related content and QAnon groups, it would first need to devise very specific language defining what exactly constitutes such a group.  

“I think with every policy, it has to be defined very clearly. Like what does that exactly mean—'a QAnon group'—exactly?” Ms Wojcicki said. “That’s a kind of thing that we would need to put in terms of the policies and make sure that we were superclear. So we are continuing to evolve our policies here. It’s not that we’re not looking at it or that we don’t want to make changes.”

Ms Wojcicki and even her more proactive peers in the tech industry may be acting too little, too late. QAnon adherents have intentionally begun to downplay the more fanciful and bizarre elements of their belief system, such as claims that high-profile Democrats have been secretly executed and replaced with clones, or that Democrats harvest children's fear to produce a substance called "adrenochrome", and instead focus on camouflaging their dangerous ideology within other social movements.

A recent anti-human trafficking campaign, #SavetheChildren, was one such instance. Though the outward project, fighting the trafficking of children, was inoffensive, the information shared using the hashtag was largely inaccurate and fell in line with QAnon beliefs.

The user who posts as Q told supporters to "drop all references re: 'Q' 'Qanon' etc. to avoid ban/termination," undermining any effort to paint a clear picture of what exactly constitutes a QAnon group.
