An autonomous weaponised drone “hunted down” a human target last year and is thought to have attacked them without being specifically ordered to, according to a report prepared for the United Nations.
The news raises the spectre of terminator-style AI weapons killing on the battlefield without any human control.
The drone, a Kargu-2 quadcopter produced by Turkish military tech company STM, was deployed in March 2020 during a conflict between Libyan government forces and a breakaway military faction led by Khalifa Haftar, commander of the Libyan National Army.
The Kargu-2 is fitted with an explosive charge and the drone can be directed at a target in a kamikaze attack, detonating on impact.
The report from the UN Security Council’s Panel of Experts on Libya, published in March 2021, was obtained by New Scientist magazine.
In one passage the report details how Haftar’s forces were “hunted down” as they retreated by Kargu-2 drones operating in a “highly effective” autonomous mode that required no human controller.
“The lethal autonomous weapons systems were programmed to attack targets without requiring data connectivity between the operator and the munition: in effect, a true ‘fire, forget and find’ capability,” the report says.
It suggests that the drones were attacking human beings on their own initiative.
There is no record of how many casualties, if any, the AI war machines inflicted.
Zak Kallenborn, of the National Consortium for the Study of Terrorism and Responses to Terrorism in Maryland, says this could be the first time that drones have autonomously attacked humans.
He says the development is cause for serious concern, given that AI systems cannot always interpret visual data correctly.
“How brittle is the object recognition system?” Kallenborn asks. “… how often does it misidentify targets?”
Jack Watling, of the UK defence think tank the Royal United Services Institute, told New Scientist that the drones fall into something of a grey area in the regulation of AI weapons, because only the drones’ controllers would know whether the machines were being remotely operated at the time of the attack.
“This does not show that autonomous weapons would be impossible to regulate,” he says. “But it does show that the discussion continues to be urgent and important. The technology isn’t going to wait for us.”