September 16, 2024
Regulating AI Is Easier Than You Think
Artificial intelligence is poised to deliver tremendous benefits to society. But, as many have pointed out, it could also bring unprecedented new horrors. As a general-purpose technology, the same tools that will advance scientific discovery could also be used to develop cyber, chemical, or biological weapons. Governing AI will require widely sharing its benefits while keeping the most powerful AI out of the hands of bad actors. The good news is that there is already a template for how to do just that.
In the 20th century, nations built international institutions to allow the spread of peaceful nuclear energy while slowing nuclear weapons proliferation by controlling access to the raw materials, namely weapons-grade uranium and plutonium, that underpin them. The risk has been managed through international institutions such as the Nuclear Non-Proliferation Treaty and the International Atomic Energy Agency. Today, 32 nations operate nuclear power plants, which collectively provide 10% of the world's electricity, and only nine countries possess nuclear weapons.
Countries can do something similar for AI today. They can regulate AI from the ground up by controlling access to the highly specialized chips needed to train the world's most advanced AI models. Business leaders and even U.N. Secretary-General António Guterres have called for an international governance framework for AI similar to that for nuclear technology.
The most advanced AI systems are trained on tens of thousands of these highly specialized computer chips, housed in massive data centers where they churn through data for months to produce the most capable models. The chips are difficult to manufacture, their supply chain is tightly controlled, and large numbers of them are needed for training.
The U.S. can work with other nations to build on this foundation and put in place a structure that governs computing hardware across the entire lifecycle of an AI model: chip-making equipment, chips, data centers, the training of AI models, and the trained models that result from this production cycle.
Read the full article from TIME.