by Jason Mick
Thus far, no nation has produced a fully autonomous robotic soldier.
I. Human Rights Watch Warns of Robotic War Crimes
However, many observers fear we are creeping towards an era in which automated killing machines are a staple of the battlefield. The U.S. and other nations have been actively developing unmanned land, air, and sea vehicles. Most of these machines are imbued with some degree of artificial intelligence and operate in a semi-autonomous fashion. However, they currently have a human operator in the loop who remains (mostly) in control.
But experts fear that within 20 to 30 years artificial intelligence and military automation will have advanced to the point where nations consider deploying fully automated war robots to kill their enemies.
International humanitarian group and war-crimes watchdog Human Rights Watch has published a 50-page report entitled “Losing Humanity: The Case Against Killer Robots”, which calls on world governments to institute a global ban on autonomous killing robots, similar to current prohibitions on the use of chemical warfare agents.
Comments Steve Goose, Arms Division director at Human Rights Watch, “Giving machines the power to decide who lives and dies on the battlefield would take technology too far. Human control of robotic warfare is essential to minimizing civilian deaths and injuries. It is essential to stop the development of killer robots before they show up in national arsenals. As countries become more invested in this technology, it will become harder to persuade them to give it up.”
II. Ban the ‘Bots
The group addresses the counter-argument — that robotic warfare saves the lives of soldiers — by arguing that it makes war too convenient. An “autocrat”, the report warns, could turn cold, compassionless robots on his own civilian population; it would be much harder to convince human soldiers to do the same.
Pull the Plug on Killer Robots
Countries could also claim their cyber-soldiers “malfunctioned” to try to get themselves off the hook for war crimes against other nations’ civilians.
And of course science fiction fans will recognize the final concern — that there could be legitimate bugs in the AI which cause the robots to miscalculate a proportional response to violence, fail to distinguish between civilian and soldier, or, worst of all, “go Terminator” and turn on their fleshy masters.
Comments Mr. Goose, “Action is needed now, before killer robots cross the line from science fiction to feasibility.”