May 30, 2018

Technology Roulette

Managing Loss of Control as Many Militaries Pursue Technological Superiority

By Richard Danzig

Executive Summary

This report recognizes the imperatives that inspire the U.S. military’s pursuit of technological superiority over all potential adversaries. These pages emphasize, however, that superiority is not synonymous with security. Experience with nuclear weapons, aviation, and digital information systems should inform discussion about current efforts to control artificial intelligence (AI), synthetic biology, and autonomous systems. In this light, the most reasonable expectation is that the introduction of complex, opaque, novel, and interactive technologies will produce accidents, emergent effects, and sabotage. In sum, on a number of occasions and in a number of ways, the American national security establishment will lose control of what it creates. 

A strong justification for our pursuit of technological superiority is that this superiority will enhance our deterrent power. But deterrence is a strategy for reducing attacks, not accidents; it discourages malevolence, not inadvertence. In fact, technological proliferation almost invariably closely follows technological innovation. Our risks from resulting growth in the number and complexity of interactions are amplified by the fact that proliferation places great destructive power in the hands of others whose safety priorities and standards are likely to be less ambitious and less well funded than ours. 

Accordingly, progress toward our primary goal, superiority, should be expected to increase rather than reduce collateral risks of loss of control. This report contends that, unfortunately, we cannot reliably estimate the resulting risks. Worse, there are no apparent paths for eliminating them or even keeping them from increasing. An often referenced recourse, keeping “humans in the loop” of operations involving new technologies, appears on inspection to offer little and declining benefit. 

We are not, however, impotent. With more attention the American military at least can dampen the likely increase in accidents and moderate adverse consequences when they occur. Presuming that the United States will be a victim of accidents, emergent effects, and sabotage, America should improve its planning for coping with these consequences. This will involve reallocating some of the immense energy our military invests in preparing for malevolence to planning for situations arising from inadvertent actions and interactions. 

The U.S. Department of Defense and intelligence agencies also must design technologies and systems not just for efficacy and efficiency, but also with more attention to attributes that can mitigate the consequences of failure and facilitate resilient recovery. The pervasive insecurity of digital information systems should be an object lesson that it is exceedingly costly, perhaps impossible, to attempt to counter loss of control after we have become dependent on a new technology, rather than at the time of design. 

Most demandingly, the United States also must work with its opponents to facilitate their control and minimize their risks of accidents. Twenty-first century technologies are global not just in their distribution, but also in their consequences. Pathogens, AI systems, computer viruses, and radiation that others may accidentally release could become as much our problem as theirs. Agreed reporting systems, shared controls, common contingency plans, norms, and treaties must be pursued as means of moderating our numerous mutual risks. The difficulty of taking these important steps should remind us that our greatest challenge is not in constructing our relationships to technologies but in constructing our relationships with each other. 

These arguments are made to the national security community. These reflections and recommendations, however, should transcend their particulars and have implications for all discussion about control of dangerous new technologies. 

The full report is available online.
