April 14, 2015

Statement to the UN Convention on Certain Conventional Weapons on Technical Issues

By Kelley Sayler

CNAS is an independent, non-profit research institution with an ongoing project on the legal, ethical, moral, and policy implications of autonomous weapons. I would like to share with you some of the findings from our research on technical issues.

Today, with a few isolated exceptions, lethal, fully autonomous weapons do not exist. However, automation and autonomy continue to be incorporated into a wide variety of functions, including those relating to the use of force. Even if efforts to pre-emptively ban the use of lethal autonomous weapon systems (LAWS) are largely successful, some states are likely to continue to pursue their development and, potentially, to one day deploy them. The international community has wisely chosen to begin preparing for a time when states have the ability to build weapon systems that can autonomously select and engage targets without human intervention.

Lethal autonomous weapon systems are not inherently contrary to the principles of international humanitarian law. If properly designed and employed in a manner consistent with the requirements of proportionality and distinction, the use of such systems could be lawful. For this reason, it will be critical to develop shared expectations about standards for testing, evaluation, employment, and accountability.

A number of states and non-governmental organizations have expressed concerns that LAWS cannot distinguish between military and civilian contexts and therefore could select and engage a military target located in a civilian environment, resulting in civilian casualties.  Such concerns are valid but could be mitigated by restrictions on the circumstances in which LAWS can be employed.  For example, use of LAWS could be limited to the undersea domain or to other areas where civilians are not present.

Likewise, some concerns about the use of LAWS could be mitigated by design characteristics that constrain their freedom of operation. These constraints could include limits on both the length of time and the geographic area in which the system is allowed to operate. This would, in turn, increase a commander's control over the system and strengthen accountability for the system's use. LAWS could also be designed to target only military hardware, providing an additional layer of protection against the targeting of civilians.

Regardless of a system's exact degree of autonomy, states should implement strict testing and evaluation standards to ensure that the system performs as intended and that procedures for its use are clear to operators. Systems should be tested in realistic operating environments to ensure that they continue to operate in accordance with their design features.

Finally, it will be essential for states to further explore the ways in which accountability for the use of LAWS can be established and enforced. While such an undertaking will undoubtedly be challenging, it will be critical to ensuring that the use of LAWS adheres to international humanitarian law.

Such issues represent important areas for discussion and should be considered during this week's deliberations. Failure to establish a shared understanding of appropriate standards for testing, evaluation, employment, and accountability could have profound consequences for the future of international security.
