Over 100 CEOs of artificial intelligence and robotics firms recently signed an open letter warning that their work could be repurposed to build lethal autonomous weapons — “killer robots.” They argued that building such weapons would open a “Pandora’s Box” that could forever alter war.
Over 30 countries have armed drones or are developing them, and with each successive generation, drones gain more autonomy. Automation has long been used in weapons to help identify targets and maneuver missiles, but to date, humans have remained in control of the decision to use lethal force. Militaries have employed automated engagements only in limited settings, such as defending against high-speed rockets and missiles. Advances in autonomous technology could change that. The same intelligence that allows self-driving cars to avoid pedestrians could allow future weapons to hunt and attack targets on their own.
Read the full op-ed in TIME.