Over 100 CEOs of artificial intelligence and robotics firms recently signed an open letter warning that their work could be repurposed to build lethal autonomous weapons — “killer robots.” They argued that building such weapons would open a “Pandora’s Box” that could forever alter war.
Over 30 countries have armed drones or are developing them, and with each successive generation, drones gain more autonomy. Automation has long been used in weapons to help identify targets and maneuver missiles. But to date, humans have remained in control of the decision to use lethal force. Militaries have employed automated engagements only in limited settings, such as defending against high-speed rockets and missiles. Advances in autonomous technology could change that. The same intelligence that allows self-driving cars to avoid pedestrians could enable future weapons to hunt and attack targets on their own.
Read the full op-ed in TIME.