February 10, 2025
How Can We Develop AI That Helps, Rather Than Harms, People?
In every technological revolution, we face a choice: build for freedom or watch as others build for control. With AI, the stakes couldn’t be higher. It already mediates 20 per cent of our waking hours through smartphones, automated systems, and digital interfaces, and it will soon touch nearly every aspect of human existence. While AI promises to liberate us for higher pursuits by “extending the number of important operations which we can perform without thinking,” history – from the iron cage of Soviet bureaucracy to modern Chinese surveillance – offers a stark warning that automation can just as easily erode our freedoms and condition us to passively accept social control.
Today’s debate about AI’s future is dominated by competing visions of control. Doomsayers, like some of those at this week’s AI Action Summit in France, advocate strict controls – even “pauses” on all development – that would forfeit progress while inviting tyranny.
Read the full article on The Spectator.