Washington, January 27 – Following the high-level discussion of autonomous weapons at the World Economic Forum in Davos, Switzerland, Paul Scharre, Senior Fellow at the Center for a New American Security and Director of the CNAS Ethical Autonomy project, has written a new Press Note, “Autonomous Weapons Fears and the Davos World Economic Forum.” CNAS’ Ethical Autonomy project examines the legal, moral, ethical, and policy issues associated with autonomous weapons.
The full Press Note is below:
Last week, experts at the World Economic Forum in Davos, Switzerland, discussed the potential dangers of autonomous weapons. Unlike today’s drones, which are controlled entirely by humans, future autonomous weapons could potentially select and engage targets on their own. What happens when a drone has as much autonomy as a self-driving car and is able to choose its own actions on the battlefield? Just as greater autonomy is being incorporated into a wide range of civilian applications, militaries are incorporating more autonomous features into future weapons. Where should the line be drawn between human and machine decision-making?
Autonomous weapons raise serious legal, ethical, and safety challenges. While the laws of war do not inherently prohibit autonomous weapons, today it would be very challenging for autonomous weapons to comply with the laws of war except under narrow circumstances. Even if they could operate lawfully, however, autonomous weapons raise serious moral and ethical challenges. Is it right to give a machine power over life and death? And finally, even if autonomous weapons were legal and moral, they could be extremely dangerous. The consequences of a malfunction or enemy hacking of an autonomous weapon could be severe. Automated stock trading algorithms, for example, have contributed to multiple stock market flash crashes. An analogous “flash war” started by autonomous weapons would be disastrous.
Scharre is available for interviews. To arrange an interview, please contact Neal Urwitz at email@example.com or 202-457-9409.