Human Control of Robot Weaponry Urged by Arms Control Groups

A new generation of weapons increasingly capable of targeting and attacking without human involvement requires that humans maintain control over them, two international arms control groups say in a report.

The report, released by Human Rights Watch and the Harvard Law School International Human Rights Clinic at the opening of a weeklong United Nations meeting on autonomous weapons in Geneva, poses a potential challenge to an emerging United States military strategy that will count on technological advantages and increasingly depend on weapons systems that blend humans and machines.

The strategy, now known as the Third Offset, seeks to exploit technologies to maintain American military superiority. Pentagon officials have recently said that the new technologies, particularly artificial intelligence software, will help, rather than replace, the human soldiers who must make killing decisions.

The report, titled “Killer Robots and the Concept of Meaningful Human Control,” said: “Machines have long served as instruments of war, but historically humans have always dictated how they are used.”

An international debate is now emerging over whether it is possible to limit the evolution of weapons that make killing decisions without human involvement, even as some have argued that autonomous weapons might one day adhere to the laws of war better than humans.

Current United States military guidelines, published in 2012, say that commanders and weapons operators must exercise “appropriate levels of human judgment” over the use of force. While the guidelines do not completely prohibit autonomous weapons, they require that high-level Pentagon officials authorize them. They also distinguish between fully autonomous weapons, which can hunt and engage targets without intervention, and semiautonomous weapons, whose targets are chosen by a person.

This distinction would be vital for new weapons that will enter the United States arsenal as early as 2018. One example is the Long Range Anti-Ship Missile, or L.R.A.S.M., initially designed by the Defense Advanced Research Projects Agency and to be manufactured by Lockheed Martin. This year, the Pentagon asked Congress to authorize $927 million over the next five years for the system. One of the primary reasons the missile is being developed is the concern that American aircraft carriers will have to operate farther from China because of its growing military power.

The missile is designed to be launched by a human operator, but it then flies to a targeted ship beyond human contact and makes final targeting decisions autonomously, a capability that has raised concerns among critics.

“I would argue that L.R.A.S.M. is intended primarily to threaten China and Russia and is only likely to be used in the opening shots of a nuclear war that would quite likely destroy our civilization and kill a large fraction, or most, or nearly all human beings,” said Mark A. Gubrud, a physicist and member of the International Committee for Robot Arms Control, a group working for the prohibition of autonomous weapons.

“We urge states to provide more information on specific technology so the international community can better judge what type and level of control should be required,” said Bonnie Docherty, a senior arms researcher at Human Rights Watch and an author of the report.

Britain, Israel and Norway have also deployed missiles and drones that carry out attacks against enemy radar or tanks without direct human control.

Some nations are now calling for an international agreement that limits such weapons.

“There seems to be a broad consensus that, at some level, humans should be involved in lethal force,” said Paul Scharre, a senior fellow at the Center for a New American Security in Washington.

(Adapted from nytimes.com)