Military drones may have attacked humans for first time without being instructed to, UN report says
Drones may have autonomously ‘hunted down and remotely engaged’ fleeing soldiers
A military drone may have autonomously attacked humans for the first time without being instructed to do so, according to a recent report by the UN Security Council.
The report, published in March, claimed that the AI drone, a Kargu-2 quadcopter produced by Turkish military tech company STM, attacked retreating soldiers loyal to Libyan General Khalifa Haftar.
The 548-page report by the UN Security Council’s Panel of Experts on Libya does not detail whether the incident caused any deaths, but it raises questions about whether global efforts to ban killer autonomous robots before they are built may be futile.
Over the course of the year, the UN-recognized Government of National Accord pushed the Haftar Affiliated Forces (HAF) back from the Libyan capital Tripoli, and the drone may have been operational since January 2020, the experts noted.
“Logistics convoys and retreating HAF were subsequently hunted down and remotely engaged by the unmanned combat aerial vehicles or the lethal autonomous weapons systems such as the STM Kargu-2,” the UN report noted.
Kargu is a “loitering” drone that uses machine learning-based object classification to select and engage targets, according to STM, and also has swarming capabilities to allow 20 drones to work together.
“The lethal autonomous weapons systems were programmed to attack targets without requiring data connectivity between the operator and the munition: in effect, a true ‘fire, forget and find’ capability,” the experts wrote in the report.
Many robotics and AI researchers, along with prominent figures such as Elon Musk, Stephen Hawking and Noam Chomsky, have called for a ban on "offensive autonomous weapons", such as those with the potential to search for and kill specific people based on their programming.
Experts have cautioned that the datasets used to train these autonomous killer robots to classify and identify objects such as buses, cars and civilians may not be sufficiently complex or robust, and that the artificial intelligence (AI) system may learn the wrong lessons.
They have also warned of the “black box” problem in machine learning, in which an AI system’s decision-making process is often opaque, posing a real risk that fully autonomous military drones could strike the wrong targets, with the reasons remaining difficult to unravel.
Zachary Kallenborn, a national security consultant specialising in unmanned aerial vehicles, believes there is greater risk of something going wrong when several such autonomous drones communicate and coordinate their actions, such as in a drone swarm.
“Communication creates risks of cascading error in which an error by one unit is shared with another,” Kallenborn wrote in The Bulletin.
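As a rough illustration of that failure mode, the hypothetical Python sketch below (not drawn from any real drone software; the error rate and the "adopt the first positive identification" rule are assumptions made purely for illustration) shows how a single unit's misclassification can become the whole swarm's decision once classifications are shared.

```python
# Hypothetical sketch of cascading error in a communicating swarm.
# Nothing here reflects real drone software; the error rate and the
# "adopt the first positive identification" rule are illustrative assumptions.
import random

random.seed(1)

NUM_DRONES = 20     # the Kargu-2 reportedly supports swarms of around 20 units
ERROR_RATE = 0.05   # assumed chance that a single drone misclassifies a target

def classify(truth: str) -> str:
    """Each drone independently classifies an object; occasionally it is wrong."""
    if random.random() < ERROR_RATE:
        return "combatant" if truth == "civilian" else "civilian"
    return truth

# Case 1: drones act independently, so an error stays local to one unit.
independent = [classify("civilian") for _ in range(NUM_DRONES)]
print("independent errors:", independent.count("combatant"))

# Case 2: drones share results and adopt the first positive identification,
# so one bad call can propagate to every unit in the swarm.
shared_label = None
for _ in range(NUM_DRONES):
    label = classify("civilian")
    if shared_label is None and label == "combatant":
        shared_label = label
swarm_view = [shared_label or "civilian"] * NUM_DRONES
print("swarm errors:", swarm_view.count("combatant"))  # either 0 or all 20
```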
“If anyone was killed in an autonomous attack, it would likely represent an historic first known case of artificial intelligence-based autonomous weapons being used to kill,” he added.