September 16, 2024
Regulating AI Is Easier Than You Think
Artificial intelligence is poised to deliver tremendous benefits to society. But, as many have pointed out, it could also bring unprecedented new harms. As a general-purpose technology, the same tools that will advance scientific discovery could also be used to develop cyber, chemical, or biological weapons. Governing AI will require widely sharing its benefits while keeping the most powerful AI out of the hands of bad actors. The good news is that there is already a template for how to do just that.
In the 20th century, nations built international institutions to allow the spread of peaceful nuclear energy while slowing nuclear weapons proliferation by controlling access to the raw materials that underpin them, namely weapons-grade uranium and plutonium. The risk has been managed through international institutions such as the Nuclear Non-Proliferation Treaty and the International Atomic Energy Agency. Today, 32 nations operate nuclear power plants, which collectively provide 10% of the world's electricity, and only nine countries possess nuclear weapons.
Countries can do something similar for AI today. They can regulate AI from the ground up by controlling access to the highly specialized chips that are needed to train the world's most advanced AI models. Business leaders and even U.N. Secretary-General António Guterres have called for an international governance framework for AI similar to that for nuclear technology.
The most advanced AI systems are trained on tens of thousands of highly specialized computer chips. These chips are housed in massive data centers, where they churn through data for months to train the most capable AI models. These advanced chips are difficult to produce, their supply chain is tightly controlled, and large numbers of them are needed to train AI models.
The U.S. can work with other nations to build on this foundation and put in place a structure to govern computing hardware across the entire lifecycle of an AI model: chip-making equipment, chips, data centers, the training of AI models, and the trained models that result from this production cycle.
Read the full article from TIME.