‘Killer robots’ would create ‘accountability gap’ for military, must be banned – HRW

10 Apr, 2015 00:16 / Updated 10 years ago

Future use of fully autonomous weapons, or ‘killer robots,’ may provide a loophole for the military to escape responsibility for unlawfully killing or injuring civilians, a report by Human Rights Watch says.

READ MORE: UN sounds alarm on rise of autonomous 'killer robots'

There are significant obstacles to assigning personal accountability for the actions of fully autonomous weapons under both criminal and civil law, states the paper, entitled Mind the Gap: The Lack of Accountability for Killer Robots.

The 38-page document is jointly published by Human Rights Watch (HRW) and Harvard Law School’s International Human Rights Clinic.

READ MORE: State Dept. prepares to give armed drones to allies

‘Killer robots’ are still the stuff of science fiction, but their possible future development “raises serious moral and legal concerns because they would possess the ability to select and engage their targets without meaningful human control,” the report said.

“There are also grave doubts that fully autonomous weapons would ever be able to replicate human judgment and comply with the legal requirement to distinguish civilian from military targets,” it added.

The authors of the paper also warned of “the prospect of an arms race and proliferation to armed forces with little regard for the law” due to the rise of fully autonomous military hardware.

With ‘killer robots’ unable to stand in for responsible humans in court, it is likely that the people behind their use, including military commanders, programmers and developers, would escape liability for crimes committed by the machines, the report said.

“No accountability means no deterrence of future crimes, no retribution for victims, no social condemnation of the responsible party,” Bonnie Docherty, senior Arms Division researcher at HRW and the report’s lead author, told HRW’s website. “The many obstacles to justice for potential victims show why we urgently need to ban fully autonomous weapons.”

READ MORE: Russian avatar cyborg, crack shot & quad bike rider, meets Putin (VIDEO)

According to the human rights watchdog, military officials could be found guilty if they intentionally deployed a machine to commit a crime, but they would most certainly escape justice in cases where they could not foresee a ‘killer robot’s’ unlawful actions.

“A fully autonomous weapon could commit acts that would rise to the level of war crimes if a person carried them out, but victims would see no one punished for these crimes. Calling such acts an ‘accident’ or ‘glitch’ would trivialize the deadly harm they could cause,” Docherty stressed.

To close the “accountability gap” that would arise from the use of autonomous weapons, the authors of the report recommend that states “prohibit the development, production and use of fully autonomous weapons through an international legally binding instrument” and adopt national laws to enforce the ban.

READ MORE: US Navy wants robots to train Marines

More than 50 NGOs from around the globe are pushing for a preemptive ban on the development, production, and use of fully autonomous weapons, with HRW serving as a co-founder and coordinator of the Campaign to Stop Killer Robots.

The report was put together ahead of a major international meeting on ‘lethal autonomous weapons systems’ (LAWS) at the UN in Geneva on April 13–17, and will be distributed during the event. The session will discuss additions to the Convention on Certain Conventional Weapons. The convention has already restricted several emerging forms of military technology: blinding lasers were banned in 1995, while a protocol that took effect in 2006 required warring parties to clear unexploded ordnance, including cluster bomb remnants, after the fighting ends.

“Fully autonomous weapons do not yet exist,” the report admits. “But technology is moving in their direction, and precursors are already in use or development. For example, many countries use weapons defense systems – such as the Israeli Iron Dome and the US Phalanx and C-RAM – that are programmed to respond automatically to threats from incoming munitions.”

Prototypes already exist for planes that could autonomously fly on intercontinental missions, such as the UK’s Taranis, or take off and land on an aircraft carrier, like the US’s X-47B. Drones, while a powerful military tool, have also proved controversial, even though they are directed by live operators from a distance.

READ MORE: Child or militant? 6th-grader killed in US drone strike in Yemen (VIDEO)