
Human Rights Watch calls for banning autonomous lethal weapons

11:16 | 22.04.2015 | Analytic


22 April 2015. PenzaNews. The international human rights organization Human Rights Watch (HRW) is calling for a complete worldwide ban on the development and use of autonomous lethal weapons (ALW). The call is set out in detail in a 38-page report titled “Mind the Gap: The Lack of Accountability for Killer Robots,” published by HRW together with Harvard Law School in April 2015.

Photo: Wikipedia.org

ALWs, the authors note, represent the next stage in the evolution of unmanned vehicles (drones), which have already drawn criticism and outrage from human rights activists. Such combat systems, nicknamed “killer robots,” would be able to select and engage targets in combat without a human operator.

“Fully autonomous weapons do not yet exist, but technology is moving in their direction, and precursors are already in use or development. For example, many countries use weapons defense systems – such as the Israeli Iron Dome and the US Phalanx and C-RAM – that are programmed to respond automatically to threats from incoming munitions. In addition, prototypes exist for planes that could autonomously fly on intercontinental missions (UK Taranis) or take off and land on an aircraft carrier (US X-47B),” says the report.

According to the human rights activists, ALWs already have many proponents who praise the positive features of “killer robots,” such as quick response; immunity to anger, fear and other emotions; and reduced losses for the military. However, autonomous lethal weapons also have significant drawbacks, HRW and Harvard Law School say.

“They would possess the ability to select and engage their targets without meaningful human control. Many people question whether the decision to kill a human being should be left to a machine,” the document stresses.

Moreover, according to the authors, a technological breakthrough that makes autonomous lethal weapons possible could trigger a new arms race and unchecked terror by groups in control of the “killer robots.”

“Once developed, fully autonomous weapons would likely proliferate to irresponsible states or non-state armed groups, giving them machines that could be programmed to indiscriminately kill their own civilians or enemy populations. Some critics also argue that the use of robots could make it easier for political leaders to resort to force because using such robots would lower the risk to their own soldiers; this dynamic would likely shift the burden of armed conflict from combatants to civilians,” write the human rights and law activists.

The document also says that a robot’s inability to feel emotions is not only an advantage but also an insurmountable obstacle to complying with legal and human rights norms. A machine that simply executes its programming has no individual judgment and lacks compassion, a key “safeguard” emotion that helps prevent civilian deaths and criminal acts.

In addition, the authors point out that the very existence of ALWs could create a series of intractable legal problems. In particular, because of their autonomous nature, “killer robots” cannot be fully classified as machines or weapons of war; at the same time, since they lack the qualities people have, they could not be found guilty of committing crimes.

“Existing mechanisms for legal accountability are ill-suited and inadequate to address the unlawful harms fully autonomous weapons might cause. These weapons have the potential to commit criminal acts – unlawful acts that would constitute a crime if done with intent – for which no one could be held responsible. A fully autonomous weapon itself could not be found accountable for criminal acts that it might commit because it would lack intentionality,” the human rights proponents state.

Furthermore, they note, it would be pointless to sentence a weapon system to punishment and expect the decision to have any effect, since it would not influence the future actions of a robot that lacks self-awareness.

According to representatives of Human Rights Watch and Harvard Law School, it would also be prohibitively difficult for people harmed by the actions of ALWs to hold those directly or indirectly involved in the creation and use of “killer robots” criminally accountable.

“Human commanders or operators could not be assigned direct responsibility for the wrongful actions of a fully autonomous weapon, except in rare circumstances when those people could be shown to have possessed the specific intention and capability to commit criminal acts through the misuse of fully autonomous weapons,” says the report.

The document further notes that users and manufacturers of autonomous lethal weapons would most likely escape civil liability for negligence. Among other things, the complex design of a “killer robot” would make it difficult to locate a possible defect in the machine’s hardware or software. In addition, the authors stress, some countries, including the US, have established legal protections for the military and its contractors that most citizens would find insurmountable.

“The military is immune from lawsuits related to: (1) its policy determinations, which would likely include a choice of weapons, (2) the wartime combat activities of military forces, and (3) acts committed in a foreign country. Manufacturers contracted by the military are similarly immune from suit when they design a weapon in accordance with government specifications and without deliberately misleading the military. These same manufacturers are also immune from civil claims relating to acts committed during wartime,” the document explains.

At the same time, proponents of autonomous weapons suggest several ways to address the problem: one of them, according to the authors, involves creating a new legal regime that would compensate people harmed by the actions of “killer robots” without establishing fault. However, the authors argue, such measures would not only fail to achieve justice but could also create a climate of impunity for those who control fully autonomous weapons.

Overall, the human rights activists say, a ban on the development, production and use of fully autonomous weapons would be the most effective and reasonable option from the technical, moral and legal points of view.

“An absolute, legally binding ban on fully autonomous weapons would provide several distinct advantages over formal or informal constraints. It would maximize protection for civilians in conflict because it would be more comprehensive than regulation. It would be more effective as it would prohibit the existence of the weapons and be easier to enforce. A ban could have a powerful stigmatizing effect, creating a widely recognized new standard and influencing even those that did not join a treaty,” the report points out. 
