CNAS is an independent non-profit research institution that has had an ongoing project on the legal, ethical, moral, and policy implications of autonomous weapons. I would like to share with you some of the findings from our research on technical issues.
Today, with a few isolated exceptions, lethal, fully autonomous weapons do not exist. However, automation and autonomy continue to be incorporated into a wide variety of functions, including those relating to the use of force. Even if efforts to pre-emptively ban the use of lethal autonomous weapon systems are largely successful, some states are likely to continue to pursue their development and – potentially – to one day deploy them. The international community has wisely chosen to begin preparing for a time when states have the ability to build weapon systems that can autonomously select and engage targets without human intervention.
Lethal autonomous weapon systems (LAWS) are not inherently counter to the principles of international humanitarian law. If properly designed and employed in a manner consistent with the requirements of proportionality and distinction, the use of such systems could be lawful. For this reason, it will be critical to develop shared expectations about standards for testing, evaluation, employment, and accountability.
A number of states and non-governmental organizations have expressed concerns that LAWS cannot distinguish between military and civilian contexts and therefore could select and engage a military target located in a civilian environment, resulting in civilian casualties. Such concerns are valid but could be mitigated by restrictions on the circumstances in which LAWS can be employed. For example, use of LAWS could be limited to the undersea domain or to other areas where civilians are not present.
Likewise, some concerns about the use of LAWS could be mitigated by design characteristics that constrain their freedom of operation. These constraints could include limits on both the length of time and the geographic area in which the system is allowed to operate. This would, in turn, increase a commander’s control over the system and strengthen accountability for the system’s use. LAWS could also be designed to target only military hardware – providing an additional layer of protection against the targeting of civilians.
Regardless of the system’s exact degree of autonomy, states should implement strict testing and evaluation standards to ensure that the system performs as intended and that procedures for its use are clear to operators. Systems should be tested in a realistic operating environment to ensure that they continue to operate in accordance with their design features.
Finally, it will be essential for states to further explore the ways in which accountability for the use of LAWS can be established and enforced. While such an undertaking will undoubtedly be challenging, it will be critical to ensuring that the use of LAWS adheres to international humanitarian law.
Such issues represent important areas for discussion. Failure to establish a shared understanding of appropriate standards for testing, evaluation, employment, and accountability could have profound consequences for the future of international security.
These are important technical issues that should be considered during the discussions this week.