In October 2022, the U.S. Department of Defense released its National Defense Strategy, which included a Nuclear Posture Review. Notably, the department committed to always maintain human control over nuclear weapons: “In all cases, the United States will maintain a human ‘in the loop’ for all actions critical to informing and executing decisions by the President to initiate and terminate nuclear weapon employment.”
This commitment is a valuable first step that other nuclear powers should follow, but it is not enough. AI-enabled nuclear weapons are particularly concerning because of their civilization-destroying potential, yet commitments like this one are dependent on time and circumstance. The U.S. military does not currently feel the need to produce and deploy such weapons, in part because it does not see other nuclear powers engaging in similar behavior; the threat of an artificial intelligence (AI)-enabled arms race is therefore not a high-level concern for military planners. As AI capabilities advance, however, the prospect of semiautonomous or fully autonomous nuclear weapons will only increase the potential for disaster.
Read the full article from Lawfare.