YouTube set to hire more staff to review extremist video content

Over 10,000 people will monitor content uploaded to YouTube in 2018

Rishika Chatterjee
Tuesday 05 December 2017 09:07 GMT
Advertisers, regulators and advocacy groups express ongoing concern over whether YouTube’s policing of its service is sufficient (Reuters)


Alphabet’s YouTube said on Monday it plans to add more people next year to identify inappropriate content as the company responds to criticism over extremist, violent and disturbing videos and comments.

YouTube has developed automated software to identify videos linked to extremism and now is aiming to do the same with clips that portray hate speech or are unsuitable for children. Uploaders whose videos are flagged by the software may be ineligible for generating ad revenue.

But amid stepped-up enforcement, the company has received complaints from video uploaders that the software is error-prone.

Adding to the thousands of existing content reviewers will give YouTube more data with which to train, and possibly improve, its machine learning software.

The goal is to bring the total number of people across Google working to address content that might violate its policies to over 10,000 in 2018, YouTube chief executive Susan Wojcicki said in one of a pair of blog posts on Monday.

“We need an approach that does a better job determining which channels and videos should be eligible for advertising,” she said. “We’ve heard loud and clear from creators that we have to be more accurate when it comes to reviewing content, so we don’t demonetise videos by mistake.”

In addition, Ms Wojcicki said the company would take “aggressive action on comments, launching new comment moderation tools and in some cases shutting down comments altogether.”

The moves come as advertisers, regulators and advocacy groups express ongoing concern over whether YouTube’s policing of its service is sufficient.

YouTube is reviewing its advertising offerings as part of its response, and it signalled that its next efforts could further change the requirements for sharing in ad revenue.

YouTube this year updated its recommendation feature to spotlight videos users are likely to find the most gratifying, brushing aside concerns that such an approach can trap people in bubbles of misinformation and like-minded opinions.

Reuters
