Over 100 CEOs of artificial intelligence and robotics firms recently signed an open letter warning that their work could be repurposed to build lethal autonomous weapons, or "killer robots." They argued that to build such weapons would be to open a "Pandora's box" that could forever alter war.
More than 30 countries have armed drones or are developing them, and each successive generation of drones has more autonomy. Automation has long been used in weapons to help identify targets and maneuver missiles, but to date humans have remained in control of the decision to use lethal force. Militaries have employed automated engagements only in limited settings, to defend against high-speed rockets and missiles. Advances in autonomous technology could change that: the same intelligence that allows self-driving cars to avoid pedestrians could allow future weapons to hunt and attack targets on their own.
Read the full op-ed in TIME.