July 20, 2018

Leading AI researchers vow to not develop autonomous weapons

Source: CNN

Journalist: Matt McFarland

In a letter published online, 2,400 researchers in 36 countries joined 160 organizations in calling for a global ban on lethal autonomous weapons. Such systems pose a grave threat to humanity and have no place in the world, they argue.

"We would really like to ensure that the overall impact of the technology is positive and not leading to a terrible arms race, or a dystopian future with robots flying around killing everybody," said Anthony Aguirre, who teaches physics at the University of California-Santa Cruz and signed the letter.

Flying killer robots and weapons that think for themselves remain largely the stuff of science fiction, but advances in computer vision, image processing, and machine learning make them all but inevitable. The Pentagon recently released a national defense strategy calling for greater investment in artificial intelligence, which the Defense Department and think tanks like the Center for a New American Security consider the future of warfare.

"Emerging technologies such as AI offer the potential to improve our ability to deter war and enhance the protection of civilians in the form of fewer civilian casualties and less collateral damage to civilian infrastructure," Pentagon spokesperson Michelle Baldanza said in a statement to CNNMoney.

"This initiative highlights the need for robust dialogue among [the Department of Defense], the AI research community, ethicists, social scientists, impacted communities, etc. and having early, open discussions on ethics and safety in AI development and usage."

Although the US holds the advantage in this field, China is catching up, and other countries are gaining ground as well. Israel, for example, has sold fully autonomous drones capable of attacking radar installations to China, Chile, India, and other countries.

The development of artificially intelligent weapons will almost certainly continue despite the opposition of leading researchers such as Demis Hassabis and Yoshua Bengio and of premier laboratories like DeepMind Technologies and Element AI. Their refusal to "participate in [or] support the development, manufacture, trade, or use" of autonomous killing machines amplifies similar calls by others but may be largely symbolic.



Read the Full Article at CNN

Author

  • Paul Scharre

    Executive Vice President and Director of Studies

    Paul Scharre is the Executive Vice President and Director of Studies at CNAS. He is the award-winning author of Four Battlegrounds: Power in the Age of Artificial Intelligence...