A pause in the progress towards "a world where machines are given the power to kill humans" was urged today by a United Nations independent human rights expert, who called for a global moratorium on the development and deployment of lethal autonomous robots (LARs).
LARs differ from armed drones and other remotely controlled weapons systems because they have the ability to decide when to attack a target, Christof Heyns, Special Rapporteur on extrajudicial, summary or arbitrary executions, stressed.
"While drones still have a 'human in the loop' who takes the decision to use lethal force, LARs have on-board computers that decide who should be targeted," Mr. Heyns said as he presented his latest report to the UN Human Rights Council.
"War without reflection is mechanical slaughter," he added. "In the same way that the taking of any human life deserves as a minimum some deliberation, a decision to allow machines to be deployed to kill human beings deserves a collective pause worldwide."
While much of their development is shrouded in secrecy, robots with full lethal autonomy have not yet been deployed, according to Mr. Heyns' report. However, robotic systems with various degrees of autonomy and attack capability are currently in use.
He cites, for example, Samsung Techwin surveillance and security guard robots, deployed in the demilitarized zone between the two countries of the Korean peninsula, which detect targets through infrared sensors. They are currently operated by humans but have an "automatic mode."
The United States Navy's Phalanx gun system, he adds, automatically detects, tracks and engages threats such as anti-ship missiles and aircraft. Israel's Harpy system is designed to detect and destroy radar emitters.
In addition, the British Taranis drone prototype can search, identify and locate enemies, but can only engage a target when authorized by mission command.
He says that further development of such systems and their decision-making capability may make it easier for States to go to war, and raises the question of whether they can be programmed to comply with the requirements of international humanitarian law, "especially the distinction between combatant and civilians and collateral damage."
"Beyond this, their deployment may be unacceptable because no adequate system of legal accountability can be devised for the actions of machines," he stated as part of his analysis of potential violations of the rights to life and human dignity, should the use of LARs materialize.
Mr. Heyns urged the Human Rights Council to call on all States "to declare and implement national moratoria on the production, assembly, transfer, acquisition, deployment and use of LARs, until a framework on the future of LARs has been established."
He invited the UN High Commissioner for Human Rights to convene or to work with other UN bodies to convene a High Level Panel on LARs to articulate this framework.
The Special Rapporteur stressed that engaging with the risks posed by LARs now, before further development, would provide an opportunity for reflection, in contrast to other revolutions in military affairs, where serious consideration mostly began only after the emergence of new methods of warfare.
"The current moment may be the best we will have to address these concerns," he said.