New CNAS Research on Autonomous Weapons
CNAS Adjunct Senior Fellow Michael C. Horowitz released two papers this week on issues surrounding autonomous weapons.
The first, published in the academic journal Research & Politics, explores public attitudes toward autonomous weapons and the claim that mass public opposition means autonomous weapon systems violate the Martens Clause of the Hague Convention. The paper shows that context, including the protection of troops and the relative effectiveness of the weapons, plays a crucial role in shaping public attitudes. Because context so strongly influences those attitudes, the paper concludes that claims about public perceptions of autonomous weapons should be made with modesty.
The second, a draft working paper, focuses on debates over the ethics and morality of lethal autonomous weapon systems (LAWS). It argues that the category of LAWS is so broad that it makes sense to analyze it in three parts: munitions, platforms, and operations planning systems. The paper then uses that framework to assess ethical and moral debates about the controllability of LAWS, accountability and responsibility, and human dignity.
Autonomous weapons in the news
(International Business Times) Vittorio Hernandez describes how an Australian professor, through analysis of military budgets, inferred which autonomous weapon technologies are in development, including biological systems and stealth technologies for drones.
(The Washington Post) Matt McFarland discusses some of the dangers of autonomous weapons and the lack of regulations. He raises the issue of autonomous weapon proliferation, specifically the possibility of non-state actors acquiring these technologies. He argues that current negotiations over regulation are moving at a “glacial” pace and must be stepped up.
(The Washington Post) David Ignatius reports on the recent Munich Security Conference. He discusses the increased likelihood of autonomous weapons and how they could influence future war, then turns to other security issues raised at the conference, such as data security and non-state actors.
(The Monitor Daily) John Birks reports on a recent panel held by the World Economic Forum at Davos concerning a potential autonomous weapons ban. The panel discussed some of the ethical, moral, and legal challenges presented by autonomous weapons and put forth a call for a ban, describing it as a “preemptive” ban that would take effect before autonomous weapons become developed enough to be dangerous.
(Tech Times) Katherine Derla discusses a new technology that could “train” artificial intelligence to behave in social settings. By reading children’s books, the AI would absorb examples of rational human decisions and use them to build a kind of moral reasoning. The technology, named Quixote, would “promote the options that will not cause harm to humans.”
(Defense One) Andrew Lohn, Andrew Parasiliti, and William Welser IV lay out what they see as the five main risks of autonomous weapons: control, hacking, targeting, mistakes, and liability. Despite these issues, they believe regulation would “maximize benefits while minimizing risk” of autonomous weapons, and they argue that regulation would be more useful than the ban some tech titans (Elon Musk, Stephen Hawking, and Steve Wozniak) have called for.
(Vice News) Ryan Faith argues that a ban on autonomous weapons is redundant, given the various developments that have already dehumanized war, such as homing cruise missiles and radar-guided weapons. Similarly, Faith argues that the moral case for a ban is not valid, since there would still be a human “in the loop” giving the order to launch the system, just as the President orders a nuclear missile launch.
Special thanks to University of Pennsylvania researcher Carter Goodwin for pulling together this news roundup.