March 10, 2025

The United States Must Avoid AI’s Chernobyl Moment

In January, U.S. President Donald Trump tasked his advisors with developing an AI Action Plan by July 2025, a roadmap intended to “sustain and enhance America’s AI dominance.” This call to action mirrors the early days of nuclear energy — a transformative technology with world-changing potential but also grave risks. Much as the nuclear industry was derailed by public backlash following disasters such as Three Mile Island and Chernobyl, AI could face a similar crisis of confidence unless policymakers take proactive steps to prevent a large-scale incident.

A single large-scale AI disaster—be it in cybersecurity, critical infrastructure, or biotechnology—could undermine public trust, stall innovation, and leave the United States trailing global competitors. Recent reports indicate plans to cut the government’s AI capacity by dismantling the AI Safety Institute. But this would be a self-inflicted wound—not only for safety, but for progress. If Washington fails to anticipate and mitigate major AI risks, the United States risks falling behind in the fallout from what could become AI’s Chernobyl moment.

The United States cannot let speculative fears trigger heavy-handed regulations that would cripple U.S. AI innovation.

For many Americans, AI’s transformative promise today echoes the optimism around nuclear power in the early 1970s, when more than half of the public supported its expansion. Yet the 1979 accident at Three Mile Island—a partial reactor meltdown—shattered that optimism, with support for nuclear energy dropping precipitously by the mid-1980s. By 1984, nearly two-thirds of Americans opposed the expansion of nuclear energy. Statistical analysis suggests that the Three Mile Island incident was associated with a 72 percent decline in nuclear reactor construction globally. Following the deadlier 1986 Chernobyl incident, countries were more than 90 percent less likely to build nuclear power plants than prior to this accident.

Just as many nations were envisioning a renaissance for nuclear energy, the 2011 Fukushima disaster in Japan triggered renewed public skepticism and policy reversals. Fukushima — the only nuclear disaster besides Chernobyl ever to reach the highest classification on the International Nuclear and Radiological Event Scale — caused public support for nuclear energy to plummet around the world. The Japanese government halted all plans for new nuclear reactors. Germany shut down all 17 of its nuclear power generation facilities, ultimately increasing its dependence on Russian fossil fuels and compromising both its energy security and climate goals. The world is still paying the opportunity cost today: Limited access to clean, reliable nuclear power remains a critical bottleneck for AI development and other energy-intensive innovations.

Read the full article on Just Security.
