November 09, 2017

The Lethal Autonomous Weapons Governmental Meeting (Part I: Coping with Rapid Technological Change)

By Paul Scharre

This week, nations meet at the United Nations to discuss lethal autonomous weapon systems (LAWS), including robotic weapons that might hunt for targets on their own. It has been 18 months since the last round of meetings. This year's discussions, the fourth since 2014, mark the first time talks will be held as a Group of Governmental Experts (GGE), a more formal format than earlier talks. While the shift to a more formal format might seem like progress toward an international consensus on what to do about autonomous weapons, the reality is that the pace of diplomacy continues to fall far behind the speed of technological advancement. Those advancements include major new capabilities but also newly discovered limits in autonomy and artificial intelligence.

When nations first began discussing autonomous weapons in 2014, the issue was fairly forward-looking. Lethal robotic weapons seemed like a distant future problem (even though simple versions had been used in limited ways for decades). In the years since, however, the field of artificial intelligence and machine learning has grown by leaps and bounds. Powered by advances in big data, computer processing power, and improvements in algorithms, AI-enabled systems are now beginning to tackle problems that had been intractable for decades. AI systems have beaten humans at poker and the Chinese strategy game Go, most recently reaching superhuman play at Go after a mere three days of self-practice with no human training data. AI systems can translate languages and transcribe speech. Self-driving cars are taking to the roads. Nation-states have deployed armies of Twitter bots to push propaganda. AI is being applied to medicine, finance, media, and many other industries. Our lives are increasingly influenced by algorithms. What might have seemed like science fiction when nations began talks only a few years ago is fast becoming everyday reality.

Read the full commentary in Just Security.

  • Podcast
    • March 12, 2019
    CNAS Tech: How (Not) to Talk About AI & Lethality

    The U.S. Army recently announced its new Advanced Targeting & Lethality Automated System, or ATLAS program. The announcement generated concern and media headlines about the le...

    By Paul Scharre, Kara Frederick & Megan Lamberth

  • Video
    • September 18, 2018
    Will WWIII Be Fought By Robots?

    What will autonomous weapons mean for how future wars are waged and the loss of human lives in armed conflicts? That's the topic of a new book, Army of None: Autonomous Weapon...

    By Paul Scharre

  • Commentary
    • Foreign Policy
    • September 13, 2018
    A Million Mistakes a Second

    Militaries around the globe are racing to build ever more autonomous drones, missiles, and cyberweapons. Greater autonomy allows for faster reactions on the battlefield, an ad...

    By Paul Scharre

  • Commentary
    • NBC News
    • August 7, 2018
    Six arrested after Venezuelan president dodges apparent assassination attempt

    Venezuelan President Nicolas Maduro was speaking at a military event when a drone carrying plastic explosives detonated on Saturday. CNAS Technology and National Security Dire...

    By Paul Scharre
