UN envoy urges end to plans for battlefield killing machines
Ethical groups join call to halt machine-soldiers that identify and kill without human input
It’s a scenario that could have emerged from the imagination of a science fiction writer – killing machines stalking future battlefields with heat-seeking weapons so that human soldiers do not have to risk their lives.
But these machines are not confined to books and blockbuster action films. They are very real – either already in use in conflict zones or in development – as governments seek ways of exploiting technology to gain an edge on the battlefield. The existence of such “killer robots” is worrying Christof Heyns, the United Nations envoy on extrajudicial, summary or arbitrary executions. Presenting a report in Geneva, he called for a ban on developing robots which could identify and kill without any human input.
Mr Heyns warned that autonomous killing machines – not yet deployed on any battlefield – could blur the lines of command in war crimes cases, and added that action must be taken before the technology overtakes existing legislation. “Time is of the essence. Trying to stop technology is a bit like trying to stop time itself – it moves on,” he said. His report argues that “modern technology allows increasing distance to be put between weapons users and the lethal force they project”.
That report is backed by the Campaign to Stop Killer Robots, a coalition of groups including Human Rights Watch, Amnesty International and Handicap International, which is calling for a halt in development of weapons which take the decision to shoot and kill out of human hands.
There has already been heated debate on the ethical implications of pilotless aircraft such as the Predator and Reaper drones, which are controlled from an air force base in Nevada, thousands of miles away from the mountains where they unload their ordnance.
But critics say this takes modern warfare too close to the realms of a computer game.
Ground robots currently deployed include the SGR-1, a robot fitted with a machine gun, which South Korea has installed along its border with its northern neighbour. While they are not quite the dead-eyed androids wandering the dystopian landscapes of Ridley Scott’s Blade Runner, or the metal killing machines of the Terminator films, they are close enough to send a shiver down many a cinemagoer’s spine.
“The biggest problem in robotics is we’ve seen too much science fiction,” said Rich Walker, managing director of the Shadow Robot Company which researches and develops robotics.
He pointed out that robots are already deployed on battlefields, performing vital tasks such as bomb disposal, and argued that some responsive robots were little different from land mines and other booby traps, which are set up by humans and respond to stimuli.
“Autonomous robots should be seen as neither a good thing nor a bad thing,” he told The Independent. “It’s the way they are deployed.”