With advancements in robotics, many assume the US military may soon turn to a synthetic army to replace human soldiers.
Roboticist Illah Nourbakhsh envisions: “If researchers set out to build a robot that can drive a regular car, climb a ladder and operate a jackhammer, that means that that robot can manipulate an AK-47. That means that robot can manipulate the controls of all the conventional military machines as well.”
Desensitizing our military forces to the gore of war raises moral questions about a culture that grows comfortable with killing.
Former Commander General Stanley McChrystal warned: “There’s a danger that something that feels easy to do and without risk to yourself, almost antiseptic to the person shooting, doesn’t feel that way at the point of impact.”
Noel Sharkey, professor of Artificial Intelligence and Robotics at Sheffield University, warns that think-tanks and the rise of the military industrial complex will culminate in the creation of synthetic armies that could turn on their creators should they be equipped with autonomous thought programming.
Sharkey explains: “With the current prices of robot construction falling dramatically and the availability of ready-made components for the amateur market, it wouldn’t require a lot of skill to make autonomous robot weapons.”
Human Rights Watch (HRW) has released a report entitled “Losing Humanity: The Case Against Killer Robots”, which warns that autonomous synthetic armed forces lack the conscious empathy that human soldiers have and could perform lethal missions without provocation.
Autonomous synthetic robots used as weapons cannot inherently conform to “the requirements of international humanitarian law” as they cannot adequately distinguish “between soldiers and civilians on the battlefield or apply the human judgment necessary to evaluate the proportionality of an attack – whether civilian harm outweighs military advantage.”
The HRW report states: “Human emotions provide one of the best safeguards against killing civilians, and a lack of emotion can make killing easier. Emotions should be viewed as central to restraint in war.”
Indeed, responsibility is questionable from a legal standpoint: who is ultimately accountable for the actions of an armed synthetic robot? Would the ultimate charge fall to the commander, the programmer, or the manufacturer?
As the Defense Advanced Research Projects Agency (DARPA) pushes research forward on autonomous robotics for “disaster relief” purposes, Brian Gerkey, spokesperson for the Open Source Robotics Foundation (OSRF), believes “there is good evidence that if we had been able to send in some kind of robot and had that robot do relatively simple things, simple manual tasks like opening valves, opening doors, getting to control panels, a lot of the following disaster could have been averted.”
In 2013, it was reported that Russia was creating its own killer robots designed to eliminate terrorists.
Deputy Prime Minister Dmitry Rogozin, who also oversees defense, explained that these robots are being developed by Russian experts to “minimize casualties in terrorist attacks and neutralize terrorists.”
Working as service “men”, the robots would assist in disaster scenarios. In the event of a terrorist situation, these robots would be utilized because of their specialized ability to “see terrorists through obstacles and effectively engage them in a standoff mode at a long distance without injuring their hostages.”
According to reports: “Russia says it’s developing special robots that will neutralize terrorists and help minimize the casualties of a terrorist attack. The head of Russia’s Defense Industry says that the killer robots will also be able to evacuate injured civilians and servicemen from the scene. Dmitry Rogozin says that Moscow is also creating systems that can see terrorists through obstacles and engage them in a standoff without injuring hostages.”
The Pentagon has developed a Multi-Robot Pursuit System that utilizes a “software and sensor package to enable a team of robots to search for and detect human presence in an indoor environment”.
In the wake of Hurricane Sandy, the Pentagon has requested that a team of “rescue robots” be engineered in time for the next “natural disaster”.
The DARPA Robotics Challenge is putting out the call for a synthetic force that can be designed for autonomous thought, yet mitigates the risk to human life when performing a rescue mission.
By remote control, soldiers can use these semi-autonomous robots to survey areas of interest before human investigations are conducted.
Because of the risks to rescue aid workers and human response teams, DARPA awarded the now Google-owned Boston Dynamics, Inc. a $10.9 million contract to manufacture humanoid robots that are bipedal, built like humans, and equipped with a sensor head with on-board computing capabilities.
Completion of the project is expected in August 2014.