2017 Toward a Ban on Lethal Autonomous Weapons: Surmounting the Obstacles


Subject Headings: Lethal Autonomous Weapon.

Notes

Cited By

Quotes

Abstract

A 10-point plan toward fashioning a proposal to ban some, if not all, lethal autonomous weapons.

Introduction

From April 11 to 15, 2016, at the United Nations Office at Geneva, the Convention on Certain Conventional Weapons (CCW) held a third year of informal meetings to hear expert testimony regarding a preemptive ban on lethal autonomous weapons systems (LAWS). A total of 94 states attended the meeting, and at the end of the week they agreed by consensus to recommend the formation of an open-ended Group of Governmental Experts (GGE). A GGE is the next step in forging a concrete proposal upon which the member states could vote. By the end of 2016, 19 states had called for a preemptive ban. Furthermore, meaningful human control, a phrase first proposed by advocates of a ban, has been adopted by nearly all the states, although its meaning is contested. A ban on LAWS would thus appear to have gained momentum. Even the large military powers, notably the U.S., have publicly stated that they will support a ban if that is the will of the member states. Behind the scenes, however, the principal powers express serious disinclination to embrace a ban, and many of the smaller states will follow their lead. The hurdles in the way of a successful campaign to ban LAWS remain daunting, but they are not insurmountable.

The debate to date has been characterized by a succession of arguments and counterarguments from proponents and opponents of a ban. This back and forth should not be interpreted as either a stalemate or a simple calculation of whether the harms of LAWS can be offset by their benefits. For all states that are signatories to the laws of armed conflict, any violation of the principles of international humanitarian law (IHL) must trump utilitarian calculations. Therefore, those who believe the benefits of LAWS justify their use, and who thus oppose a ban, are intent that LAWS not become a special case within IHL. Demonstrating that LAWS pose unique challenges for IHL has been a core strategy for supporters of a ban.

The more than 3,100 AI/robotics researchers who signed Autonomous Weapons: An Open Letter From AI & Robotics Researchers reflect a broad consensus among citizens, and even active military personnel, in favor of a preemptive ban.4 This consensus is partially attributable to speculative, futuristic, and fictional scenarios. But perhaps even science fiction represents a deep intuition that unleashing LAWS is not a road humanity should tread.

Researchers who have waded into the debate over banning LAWS have come to appreciate the manner in which geopolitics, security concerns, the arcana of arms control, and linguistic obfuscations can turn a relatively straightforward proposal into an extremely complicated proposition. A ban on LAWS does not fit easily, or perhaps at all, into traditional models for arms control. If a ban, or even a moratorium, on the development of LAWS is to progress, it must be approached creatively.

I favor, and have long been a supporter of, a ban. While a review of the extensive debate over whether LAWS should be banned is well beyond the scope of this paper, I wish to share a few creative proposals that could move the campaign to ban LAWS forward. Many of these proposals were expressed during my testimony at the CCW meeting in April and during a side luncheon event. Before introducing those proposals, let me first point out some of the obstacles to fashioning an arms control agreement for LAWS.

A 10-Point Plan

Many of the barriers to fitting a ban on LAWS into traditional approaches to arms control can be overcome by adopting the following approach.

  1. Rather than focus on establishing a bright line or clear definition for lethal autonomy, first establish a high-order moral principle that can garner broad support. My candidate for that principle is: Machines, even semi-intelligent machines, should not be making life-and-death decisions. Only moral agents should make life-and-death decisions about humans. Arguably, something like this principle is already implicit, but not explicit, in existing international humanitarian law, also known as the laws of armed conflict (LOAC).3 A higher-order moral principle makes explicit what is off limits, while leaving open the discussion of marginal cases where a weapon system may or may not be considered to be making life-and-death decisions.
  2. Insist that meaningful human control and making a life-and-death decision require real-time authorization from designated military personnel for a LAW to kill a combatant or to destroy a target that might harbor combatants and non-combatants alike. In other words, it is not sufficient for military personnel to merely delegate a kill order in advance to an autonomous weapon or to merely be "on the loop" of systems that can act without a real-time go-ahead.
  3. Petition leaders of states to declare that LAWS violate existing IHL. In the U.S. this would entail a Presidential Order to that effect.14
  4. Review marginal or ambiguous cases to set guidelines for when a weapon system is truly autonomous and when its actions are clearly the extension of a military commander's will and intention. Recognize that any definition of autonomy will leave some cases ambiguous.
  5. Underscore that some present and future weapon systems will occasionally act unpredictably, and that most LAWS will be difficult, if not impossible, to test adequately.
  6. Present compelling cases for banning at least some, if not all, LAWS. In other words, highlight situations in which nearly all parties will support a ban. For example, no nation should want LAWS that can launch nuclear warheads.
  7. Accommodate the fact that there will be necessary exceptions to any ban. For example, defensive autonomous weapons that target unmanned incoming missiles are already widely deployed. These include the U.S. Aegis Ballistic Missile Defense System and Israel's Iron Dome.
  8. Recognize that future technological advances may justify additional exceptions to a ban. The use of LAWS to protect refugee non-combatants, for example, would probably be embraced as an exception. Whether the use of LAWS in a combat zone where there are no non-combatants should be treated as an exception would need to be debated. Offensive autonomous weapon systems that do not target humans, but only target, for example, unmanned submarines, might also be deemed an exception.
  9. Utilize the clearly unacceptable LAWS to campaign for a broad ban that includes a mechanism for adding future exceptions.
  10. Demand that the onus of ensuring that LAWS will be controllable, and that those who deploy them will be held accountable, lies with the parties who petition for and deploy an exception to the ban.

Conclusion

The short-term benefits of LAWS could be far outweighed by long-term consequences. For example, a robot arms race would not only lower the barriers to starting new wars, whether accidentally or intentionally, but could also produce a pace of combat that exceeds human response time and the reflective decision-making capabilities of commanders. Small, low-cost drone swarms could turn battlefields into zones unfit for humans. The pace of warfare could escalate beyond meaningful human control. Military leaders and soldiers alike are rightfully concerned that military service would be expunged of any virtue.

In concert with the compelling legal and ethical considerations LAWS pose for IHL, unpredictability and risk concerns suggest the need for a broad prohibition. To be sure, even with a ban, bad actors will find LAWS relatively easy to assemble, camouflage, and deploy. The Great Powers, if they so desire, will find it easy to mask whether a weapon system has the capability of functioning autonomously.

The difficulties in effectively enforcing a ban are perhaps the greatest barrier to be overcome in persuading states that LAWS are unacceptable. People and states under threat perceive advanced weaponry as essential for their immediate survival. The stakes are high. No one wants to be at a disadvantage in combating a foe that violates a ban. And yet, violations of the ban against the use of biological and chemical weapons by regimes in Iraq and in Syria have not caused other states to adopt these weapons.

The power of a ban goes beyond whether it can be absolutely enforced. The development and use of biological and chemical weapons by Saddam Hussein helped justify the condemnation of the regime and the eventual invasion of Iraq. Chemical weapons use by Bashar al-Assad has been widely condemned, even if the geopolitics of the Syrian conflict have undermined effective follow-through in support of that condemnation.

References

  • 1. Arkin, R. The Case for Banning Killer Robots: Counterpoint. Commun. ACM 58, 12 (Dec. 2015), 46–47.
  • 2. Arkin, R. Governing Lethal Behavior in Autonomous Systems. CRC Press, Boca Raton, FL, 2009.
  • 3. Asaro, P. On Banning Autonomous Lethal Systems: Human Rights, Automation and the Dehumanizing of Lethal Decision-making. Special Issue on New Technologies and Warfare. International Review of the Red Cross 94, 886 (Summer 2012), 687–709.
  • 4. Carpenter, C. How Do Americans Feel About Fully Autonomous Weapons? The Duck of Minerva (June 19, 2013); http://bit.ly/2mBKMnR
  • 5. Gormley, D.M. Missile Contagion. Praeger Security International, 2008.
  • 6. Gubrud, M. and Altmann, J. Compliance Measures for An Autonomous Weapons Convention, ICRAC Working Paper Series #2, International Committee for Robot Arms Control (2013); http://bit.ly/2nf0LFu
  • 7. Peck, M. Global Hawk Crashes: Who's to Blame? National Defense 87, 594 (2003); http://bit.ly/2mQJgeJ
  • 8. Perrow, C. Normal Accidents: Living With High-Risk Technologies. Basic Books, New York, 1984.
  • 9. Pinker, S. The Better Angels of Our Nature: Why Violence Has Declined. Penguin, 2011.
  • 10. Roff, H. and Danks, D. Trust But Verify: The Difficulty of Trusting Autonomous Weapons Systems. Journal of Military Ethics. (Forthcoming).
  • 11. Sagan, S.D. The Limits of Safety: Organizations, Accidents, and Nuclear Weapons. Princeton University Press, Princeton, NJ, 2013.
  • 12. Taleb, N.N. The Black Swan: The Impact of the Highly Improbable. Random House, 2007.
  • 13. Wallach, W. Terminating the Terminator. Science Progress, 2013; http://bit.ly/2mjl2dy
  • 14. Wallach, W. and Allen, C. Framing Robot Arms Control. Ethics and Information Technology 15, 2 (2013), 125–135.


Author: Wendell Wallach
Title: Toward a Ban on Lethal Autonomous Weapons: Surmounting the Obstacles
DOI: 10.1145/2998579
Year: 2017