February 29, 2024

The Perilous Coming Age of AI Warfare

Last year, the Ukrainian drone company Saker claimed it had fielded a fully autonomous weapon, the Saker Scout, which uses artificial intelligence to make its own decisions about whom to kill on the battlefield. The drone, Saker officials declared, had carried out autonomous attacks on a small scale. Although this claim has not been independently verified, the technology necessary to create such a weapon certainly exists. It is a small technical step—but a consequential legal, moral, and ethical one—to produce fully autonomous weapons capable of searching out and selecting targets on their own.

The end state of this competition will likely be war executed at machine speed and beyond human control.

The reported deployment of Saker's drone shows that the window to regulate autonomous weapons is closing fast. Countries have been discussing what to do about autonomous weapons for a decade, but they have been unable to agree on regulations to limit the weapons' harms. Yet there is an urgent need for international agreement. The unconstrained development of autonomous weapons could lead to wars that expand beyond human control, with fewer protections for combatants and civilians alike. Even if a wholesale ban is not realistic, there are many practical regulations that governments can adopt to mitigate autonomous weapons' worst dangers. Without limits, humanity risks barreling toward a future of dangerous, machine-driven warfare.

Read the full article from Foreign Affairs.
