July 29, 2016

Who’ll want artificially intelligent weapons? ISIS, democracies, or autocracies?

One of the biggest fears about the nexus of artificial intelligence and the military is that machine learning—a type of artificial intelligence that allows computers to learn from new data without being explicitly programmed—could spread rapidly through military systems and even spark an arms race. Alarm over armed robots extended even to the Dallas Police Department's use, in July 2016, of a remotely piloted (rather than autonomous) bomb disposal robot retrofitted with an explosive. That event triggered a wave of articles about the consequences of arming robots, especially outside the military.

Discussions of the military applications of robotics have tended to focus on the United States, largely because of America’s extensive use of uninhabited (also called unmanned) aerial vehicles, or “drones,” to conduct surveillance and launch lethal strikes against suspected militants around the world. Yet limiting the discussion of military robotics to systems developed by wealthy, democratic countries such as the United States may miss important underlying trends.

Read the full article on the Bulletin of the Atomic Scientists.
