Isis videos targeted by artificial intelligence that can detect propaganda before it's uploaded

Developers hope to combat threat of lone-wolf attacks by 'cutting propaganda off at the source'

Lizzie Dearden
Home Affairs Correspondent
Tuesday 13 February 2018 01:12 GMT
Marc Warner describes the 'SherlockML' system that can stop Isis videos being uploaded


Artificial intelligence technology that can detect Isis videos and prevent them from being uploaded is being released to stop the spread of the “poisonous” material.

Developers funded by the Home Office are sharing their software for free with any website or app in the world in the hope it will make the terrorist group’s propaganda harder to access and share.

Tests suggest it can detect 94 per cent of Isis videos and makes so few mistakes that a single person could moderate borderline cases for the whole of YouTube.
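A rough back-of-envelope calculation shows why a very low false-positive rate matters for that moderation claim. Only the 94 per cent detection figure comes from the reported tests; the false-positive rate and daily upload volume below are illustrative assumptions.

```python
# Rough sanity check of the moderation claim. Only the 94 per cent detection
# rate comes from the reported tests; the false-positive rate and the daily
# upload volume are illustrative assumptions, not published figures.

detection_rate = 0.94            # share of Isis videos flagged (reported)
false_positive_rate = 0.00005    # assumed: 0.005% of benign uploads wrongly flagged
daily_uploads = 5_000_000        # assumed daily uploads on a very large platform

flagged_benign_per_day = daily_uploads * false_positive_rate
print(f"Benign videos flagged per day: {flagged_benign_per_day:.0f}")
# With these assumptions, roughly 250 borderline cases a day would need a
# human look -- a workload a single moderator could plausibly handle.
```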

Dr Marc Warner, chief executive of ASI Data Science, told The Independent the technology’s success depends on how many companies build it into their systems but “we hope that this can play its part in removing extremist content from the web”.

“Lone-wolf attacks are hard to spot with conventional surveillance – it is a difficult problem if someone is radicalised in their bedroom,” he added.

“The way to fight that is to cut the propaganda off at the source. We need to prevent all these horrible videos ever getting to the sort of people who can be influenced by them.”

Security services have issued increasingly urgent warnings over online or “remote” radicalisation amid a record number of terror arrests in the UK.

An image from Isis’s Dabiq propaganda magazine. Isis’s propaganda output has been unprecedented compared with other terrorist groups, including high-quality videos and magazines in multiple languages

A report by the Independent Reviewer of Terrorism Legislation said the phenomenon made threats “acutely difficult to spot”, in the wake of five attacks in London and Manchester.

Andrew Parker, the head of MI5, has said companies have an “ethical responsibility” to help confront an unrelenting threat from people who can “accelerate from inception to planning to action in just a handful of days”.

Isis propaganda has been linked to numerous plots in Britain, as well as driving the exodus of more than 800 men, women and children to its self-declared “caliphate”.

The terrorist group’s sophisticated output has ranged from gory execution videos to attack instructions, bomb-making guides, calls to arms, ideological teachings and rosy depictions of life inside the so-called Islamic State.

More than 1,300 videos released by its central media office since 2014 were analysed by ASI, which identified “subtle signals” that can be used to identify new Isis videos before they are published.

The indicators used by the “advanced machine learning” system are specific to Isis and are intended to avoid the problems created by YouTube’s broad-brush efforts, which resulted in the removal of evidence of Syrian war crimes.
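ASI has not published which “subtle signals” its classifier relies on, so the sketch below only illustrates the general approach the article describes: summarise each video into a feature vector, train a supervised model on labelled examples, then score new uploads. The feature extraction and choice of model here are assumptions made purely for illustration.

```python
# Minimal sketch of an upload-time video classifier of the general kind
# described in the article. The features and model are assumptions; the
# "subtle signals" ASI's system actually uses have not been published.

import numpy as np
from sklearn.linear_model import LogisticRegression

def extract_features(video_frames: list[np.ndarray]) -> np.ndarray:
    """Assumed placeholder: summarise per-frame colour statistics into one vector."""
    per_frame = np.array([frame.mean(axis=(0, 1)) for frame in video_frames])
    return np.concatenate([per_frame.mean(axis=0), per_frame.std(axis=0)])

def train(videos: list[list[np.ndarray]], labels: list[int]) -> LogisticRegression:
    """Train on labelled examples, as the article implies ASI did with 1,300+ videos."""
    X = np.stack([extract_features(v) for v in videos])
    return LogisticRegression(max_iter=1000).fit(X, labels)

def flag_for_review(model: LogisticRegression, video_frames: list[np.ndarray],
                    threshold: float = 0.5) -> bool:
    """Score a new upload; anything above the threshold is held for a human check."""
    score = model.predict_proba(extract_features(video_frames)[None, :])[0, 1]
    return score >= threshold
```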

The tool can be integrated into the upload process of any platform, with the Home Office hoping to reach smaller companies who are unable to fund the huge removal operations mounted by the likes of Facebook and YouTube.
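The article does not describe the tool’s actual integration interface, so the snippet below is only a hypothetical illustration of where such a check could sit in a platform’s upload pipeline; the decoder, classifier and publish steps are stand-ins.

```python
# Hypothetical illustration of an upload-time check. None of these function
# names come from the real tool, whose API is not described in the article.

from typing import Callable, List

def decode_frames(video_bytes: bytes) -> List:
    """Placeholder: a real platform would decode the upload into frames here."""
    return []

def publish(video_bytes: bytes) -> None:
    """Placeholder: push the video live."""

def handle_upload(video_bytes: bytes,
                  looks_like_propaganda: Callable[[List], bool]) -> str:
    frames = decode_frames(video_bytes)
    if looks_like_propaganda(frames):
        return "held for human review"   # borderline cases go to a moderator
    publish(video_bytes)
    return "published"
```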

An Isis propaganda video showing boys being rewarded by jihadis for memorising the Quran in Syria

Research shows that Isis supporters used 400 different platforms to spread the group’s material in 2017, with 145 used for the first time between July and December amid intensifying crackdowns by tech giants.

“Google and Facebook can’t solve this problem alone ... it’s a far wider problem,” Dr Warner said. “We are trying to eliminate this horrific content across the entire web and not just on specific platforms.”

John Gibson, the director of data science services at ASI, acknowledged that the software will not prevent Isis uploading videos to its own websites – which are frequently taken down – or on the encrypted messaging app Telegram.

But he said it could make Isis propaganda “harder to find” and share with others.

The Home Office paid ASI – which has previously used data to predict bacon sandwich sales on easyJet flights and make buses run on time – £600,000 to create the tool over just five months from the project’s inception in September.

Officials are now looking at the possibility of expanding machine learning to target propaganda from other groups including al-Qaeda.

Charlie Winter, a senior researcher at the International Centre for the Study of Radicalisation and Political Violence (ICSR) at King's College London, said it could have a “significant impact”.

“If the big technology companies jump on board it will make it difficult to find this material on the open internet, but there will always be video sharing platforms that won’t take on this software,” he told The Independent.

“It’s a step in the right direction but it won’t solve the problem. Censorship is limited to contracting the reach of this material and it will never be able to eradicate it from the internet.”

The technology targets only videos and will not be able to detect Isis magazines, newspapers, photo sets or text-based propaganda.

The terrorist group’s prolific output decreased markedly in October, November and December – in the wake of military operations that drove jihadis out of key strongholds including the Syrian city of Raqqa – but has since recovered to former levels.

Mr Winter said the new technology will not have an impact on “the card-carrying supporters of Isis” who get their information from Telegram but may “limit the opportunity of people who are curious and vulnerable to expose themselves to this kind of material”.

Amber Rudd will discuss the model during meetings on Tuesday with communication companies in San Francisco’s Silicon Valley, including Vimeo, Telegra.ph and pCloud.

The Home Secretary is expected to tell the Global Internet Forum to Counter Terrorism that all five attacks on UK soil last year “had an online component” and extremists are being increasingly influenced by material viewed on the internet.

“I hope this new technology the Home Office has helped develop can support others to go further and faster,” Ms Rudd will say.

“The purpose of these videos is to incite violence in our communities, recruit people to their cause, and attempt to spread fear in our society. We know that automatic technology like this can heavily disrupt the terrorists’ actions, as well as prevent people from ever being exposed to these horrific images.

“This Government has been taking the lead worldwide in making sure that vile terrorist content is stamped out.”
