April 17, 2015

Statement to the UN Convention on Certain Conventional Weapons on Way Ahead

By Paul Scharre

Thank you Mr. Chairperson. We would like to take a moment to reflect on what we have heard this week. The international community has come together to discuss a challenging issue, the implications of increasing autonomy for military operations, particularly in the use of force. This is an important issue, and we commend states and members of civil society for engaging these issues.

They are also difficult issues. Unlike previous weapons that the CCW has dealt with, autonomous weapons generally do not yet exist, and so sometimes it is hard to envision exactly what autonomous weapons are and how they might be used.

It is clear that there is still work to be done in converging on a common understanding of autonomous weapons. We have heard a divergent array of views, with some viewing LAWS as weapons with more advanced targeting systems and a wider search area than missiles today, and others viewing them as thinking or learning machines with human-level cognition. There are significant differences between these viewpoints, with important consequences for understanding the risks and potential benefits of autonomous weapons.

The current lack of a common language for communicating about autonomous weapons makes this discussion challenging. We also lack a common framework for thinking about the potential implications of LAWS. From disagreements about whether autonomous weapons violate the public conscience provision of the Martens Clause to questions about why any state would want to develop or use them in the first place, it is clear that the international community is still in the early stages of discussions on these issues. It is critical to avoid rushing to judgment.

Nevertheless, we have heard some common themes.

We have not heard any members suggest that human control over the use of force is unnecessary. Rather, there seems to be an emerging consensus that human control and judgment are needed. And most seem to agree that this control must meet some necessary standard of quality, just as it must with the use of weapons today. Whether one uses the term meaningful, adequate, effective, or some other term, there seems to be agreement that human judgment is needed over the use of force.

And indeed, a future in which humans are no longer involved in decisions about life and death is a disturbing one.

Humans have certain obligations under the laws of war and, as many pointed out, there are important moral and ethical considerations outside of IHL as well.

As we begin to understand how autonomy should be incorporated into future military systems, we should not lose sight of these obligations and considerations.

It is humans who are bound by the laws of war, who adhere to them or break them.

Machines are tools, and we should take care not to see them as agents. LAWS may be used by humans in ways that are lawful or unlawful, but LAWS themselves are not agents under the laws of war. We should not ask whether LAWS could make decisions about proportionality, distinction, and military necessity in compliance with IHL. Rather, we should ask whether LAWS could be used – by humans – in ways that comply with IHL. These are human judgments. Human judgment is and will remain essential.

At the same time, we should not attempt to make blanket determinations about what can or cannot be done in the future based on the state of technology today. We heard this week from a leading researcher in artificial intelligence that computers are able to recognize over 1,000 categories of objects and perform facial recognition better than humans. Is it so hard to imagine that computers might one day recognize objects in war better than people – distinguishing, for example, between a person carrying a rifle and a person carrying a rake?

Therefore, the choice before us is not humans or LAWS. Instead, we should look for ways to use autonomy to make the use of force more lawful and more discriminate, and to enhance human accountability and responsibility.

Thank you.
