September 16, 2024
Regulating AI Is Easier Than You Think
Artificial intelligence is poised to deliver tremendous benefits to society. But, as many have pointed out, it could also bring unprecedented new harms. As a general-purpose technology, the same tools that will advance scientific discovery could also be used to develop cyber, chemical, or biological weapons. Governing AI will require widely sharing its benefits while keeping the most powerful AI out of the hands of bad actors. The good news is that there is already a template for how to do just that.
In the 20th century, nations built international institutions to allow the spread of peaceful nuclear energy while slowing nuclear weapons proliferation by controlling access to the raw materials, namely weapons-grade uranium and plutonium, that underpin them. The risk has been managed through international institutions such as the Nuclear Non-Proliferation Treaty and the International Atomic Energy Agency. Today, 32 nations operate nuclear power plants, which collectively provide 10% of the world’s electricity, and only nine countries possess nuclear weapons.
Countries can do something similar for AI today. They can regulate AI from the ground up by controlling access to the highly specialized chips that are needed to train the world’s most advanced AI models. Business leaders and even U.N. Secretary-General António Guterres have called for an international governance framework for AI similar to the one that exists for nuclear technology.
The U.S. can work with other nations to build on this foundation, putting in place a structure to govern computing hardware across the entire lifecycle of an AI model: chip-making equipment, chips, data centers, the training of AI models, and the trained models that result from this production cycle.
The most advanced AI systems are trained on tens of thousands of highly specialized computer chips. These chips are housed in massive data centers where they churn on data for months to train the most capable AI models. These advanced chips are difficult to produce, the supply chain is tightly controlled, and large numbers of them are needed to train AI models.
Read the full article from TIME.