YouTube surprises Wikipedia with plan to use it to help counter conspiracy videos

YouTube has come under fire for its role in promoting hoaxes and conspiracies

Jeremy B. White
San Francisco
Wednesday 14 March 2018 21:46 GMT
YouTube CEO Susan Wojcicki speaks on stage during a conference in San Jose, California (REUTERS/Stephen Lam)

YouTube plans to counter conspiracy theories using Wikipedia, an idea that generated surprise and scepticism from the open-source encyclopaedia.

The Google-owned video giant has come under criticism for its role in disseminating false and extremist content, with the site’s algorithms promoting hoaxes and conspiracy theories in the frenzied aftermath of breaking news events such as mass shootings.

In an effort to rebut false information, YouTube CEO Susan Wojcicki unveiled a new initiative that will pair videos founded on shaky premises with “information cues”, or text boxes that direct users to third-party sources like Wikipedia.

The announcement caught Wikipedia off guard, with the Wikimedia Foundation saying in a statement that it had not entered a formal partnership with YouTube and was not given advance notice of Ms Wojcicki’s talk.

Thousands of editors are already at work monitoring and seeking to “combat conspiracies, pseudo-science, fringe theories, and more”, the statement said, experience relevant to the kind of fact-checking operation YouTube suggested.

But people who work with Wikipedia had doubts that the platform - which relies on contributors adding and editing material in its vast online repository - was the ideal tool to help fix YouTube’s misinformation problem.

“I don't think YouTube can rely on our irregularly updated *encyclopedia* to solve its ranking algorithm/hate speech issue”, a Wikipedia contributor named Phoebe Ayers wrote on Twitter.

“It's not polite to treat Wikipedia like an endlessly renewable resource with infinite free labor”, she added.

The Wikimedia Foundation’s executive director, Katherine Maher, echoed Ms Ayers’ point about relying on Wikipedia’s volunteer base, saying the site already lacked the level of “support that is critical to our sustainability”, and warning that Wikipedia was not always accurate.


“We don’t want you to blindly trust us”, Ms Maher wrote on Twitter.

A YouTube representative said the new features would roll out in the coming months but did not provide details on how they would work.
