
Facebook, Google and Twitter criticised by MPs over failure to 'answer basic questions' about coronavirus and 5G

Tech companies accused of not doing enough to stop spread of false stories and hoaxes

Andrew Griffin
Thursday 30 April 2020 17:32 BST
People play football in front of graffiti reading ‘Stop 5G Paranoia’ painted on a wall in east London on 19 April 2020

Facebook, Google and Twitter have been accused by MPs of failing to "answer basic questions" about the spread of false stories and hoaxes relating to coronavirus.

Representatives from the three companies appeared over video link in front of the Digital, Culture, Media and Sport (DCMS) sub-committee on online harms and disinformation, where they were asked in particular about the spread of false stories in relation to the outbreak.

But MPs said they had failed to give satisfactory accounts of what they were doing to stop the spread of conspiracy theories such as those linking 5G with coronavirus, as well as other issues.

Committee chairman Julian Knight said MPs would be writing to each of the tech giants to voice their "displeasure" at a "lack of answers" given on content moderation on their platforms.

The appearance of the internet giants comes as they, governments and other organisations continue trying to stop the spread of disinformation linked to Covid-19, which has seen fake cures touted online and phone masts attacked after a debunked conspiracy theory claimed 5G technology was linked to the outbreak.

Facebook's UK public policy manager Richard Earley was criticised for not being able to confirm the number of content moderators the firm currently had reviewing explicit material flagged to the platform.

Conservative MP Damian Hinds raised the issue after an NSPCC report last week highlighted concerns about children's safety online, suggesting that the Covid-19 lockdown had led to staff cuts at social media platforms and therefore fewer reviewers able to find and remove child exploitation material.

Mr Earley said Facebook had taken steps to "minimise any negative impact on our ability to review content", including moving responsibility for the most serious content review subjects - such as child abuse material and self-harm content - to its available full-time employees and putting in place systems that allowed other contracted moderators to work from home.

But he admitted some volunteers were also being used for moderation.

"We've also had a large number of employees who don't even review content in their daily roles, volunteering to step forward and help make sure we did not see significant negative impacts on that queue," he said.

But when pressed by the committee on whether Facebook had the same number of moderators working as before the pandemic, or fewer, Mr Earley said he was unable to answer because the situation was changing each day.

Mr Hinds urged Facebook to respond in writing on the issue, while Mr Knight said it appeared that none of the witnesses had been supplied with "genuine, hard information on how you are specifically going about tackling Covid disinformation".

Andy Burrows, NSPCC associate head of child safety online policy, said Facebook's admission was "deeply troubling".

"Although no-one could have foreseen these circumstances, the reality is platforms for years have failed to protect children from abuse and harmful content," he said.

"Now those cracks are being exposed and exacerbating a perfect storm for child abuse.

"This goes to show the urgent need for the duty of care legislation which would force tech firms to protect children on their sites and tackle years of industry inaction head-on."

Following a number of fractious exchanges between MPs on the committee and the tech firms' representatives, Mr Knight said he had not heard "any facts" from Facebook, while Twitter's Katy Minshall was accused of using "pre-prepared" remarks rather than attempting to answer questions.

Following an exchange between Google's public policy and government relations manager, Alina Dimofte, and SNP MP John Nicolson on online gambling, Mr Knight expressed his frustration at the number of follow-up letters the committee would need to write because the three witnesses had been "seemingly unable to answer quite basic questions".

"We will be writing to all the organisations and frankly we will be expressing our displeasure at the quality of the answers - well a lack of answers - that we've received today and will be seeking further clarity," the committee chairman said.

Additional reporting by Press Association
