More than 100 CEOs of artificial intelligence and robotics firms recently signed an open letter warning that their work could be repurposed to build lethal autonomous weapons, or "killer robots." They argued that building such weapons would open a "Pandora's box" that could forever alter war.
More than 30 countries have armed drones or are developing them, and each successive generation of drones has more autonomy. Automation has long been used in weapons to help identify targets and maneuver missiles, but to date humans have remained in control of the decision to use lethal force. Militaries have used automated engagements only in limited settings, to defend against high-speed rockets and missiles. Advances in autonomous technology could change that: the same intelligence that allows a self-driving car to avoid pedestrians could allow future weapons to hunt and attack targets on their own.
Read the full op-ed in TIME.