September 12, 2018

The Algorithms of August

By Michael Horowitz

An artificial intelligence arms race is coming. It is unlikely to play out in the way that the mainstream media suggest, however: as a faceoff between the United States and China. That’s because AI differs from the technologies, such as nuclear weapons and battleships, that have been the subject of arms races in the past. After all, AI is software—not hardware.

Because AI is a general purpose technology—more like the combustion engine or electricity than a weapon—the competition to develop it will be broad, and the line between its civilian and military uses will be blurry. There will not be one exclusively military AI arms race. There will instead be many AI arms races, as countries (and, sometimes, violent nonstate actors) develop new algorithms or apply private sector algorithms to help them accomplish particular tasks.

In North America, the private sector invested some $15 billion to $23 billion in AI in 2016, according to a McKinsey Global Institute report. That’s more than 10 times what the U.S. government spent on unclassified AI programs that same year. The largest share came from companies such as Google and Microsoft, as well as a number of smaller private firms, not from government-funded defense research. This reverses the dynamic from the Cold War, when government investments led to private sector innovation and produced technologies such as GPS and the internet.

China says it already holds more than 20 percent of patents in the field and plans to build its AI sector to be worth $150 billion by 2030. But while Beijing and Washington are the current leaders in this race, they are not the only competitors. Countries around the world with advanced technology sectors, from Canada to France to Singapore, also have the potential to make great strides in AI (or build on lower-level advances made by others). While this diffusion means that many more countries will have a stake in the regulation of AI, it also means that many more governments will have incentives to go it alone.


Read the Full Article at Foreign Policy
