Internet giants given one-hour deadline to take down terrorist propaganda
Google and Facebook argue European Commission's new time limit too short to allow for effective investigation
Internet giants Google, Facebook and Twitter are facing renewed pressure to tackle the problem of terrorist propaganda online after the European Commission (EC) gave them just a one-hour deadline to remove offensive content from their pages or face penalties.
The EC's demand comes at a time when the major search and social media companies are being urged to do more to censor inappropriate or illegal material posted by users and hosted on their domains.
According to the new recommendations issued on Thursday, leading web companies must move to take down terrorist material, posts that incite hatred or violence, child sexual abuse videos and images, sites trading in illegal goods or counterfeit products, and instances of copyright infringement within 60 minutes of being uploaded.
"Considering that terrorist content is most harmful in the first hours of its appearance online, all companies should remove such content within one hour from its referral as a general rule," the EC said in a statement.
The commission will also ask companies to report back on the degree of co-operation they receive from other organisations in order to determine whether stricter legislation is necessary.
Most online media companies have clear rules in place warning users against publishing hate speech and routinely investigate and remove troubling content as soon as it is reported by users.
However, the major players had previously signed up to a 24-hour timeframe for deleting objectionable content and argue that the new proposal leaves too little time to act.
"Such a tight time limit does not take due account of all actual constraints linked to content removal and will strongly incentivise hosting services providers to simply take down all reported content,” the Computer & Communications Industry Association warned in response.
Facebook has recently changed the way topical content is shown in users' feeds to counter the problem of "fake news", while YouTube announced in December that it was hiring 10,000 new moderators to more proactively police clips hosted on the site.