January 25, 2017

CNAS Releases Report on Automation and the Patriot Air and Missile Defense System

By Neal Urwitz

The Center for a New American Security (CNAS) Future of Warfare Initiative has released a new report, “Patriot Wars: Automation and the Patriot Air and Missile Defense System.” In the report, Dr. John Hawley – an engineering psychologist with the U.S. Army Research Laboratory’s Human Research and Engineering Directorate – discusses how lessons learned from previous attempts at human-machine integration can be applied to the prudent use of automated and near-autonomous systems. Paul Scharre, Director of the CNAS Future of Warfare Initiative, wrote the foreword to the report.

The views expressed in the report are the author’s own and do not necessarily reflect those of the United States Army. The full report can be found here.

Please find a podcast on the report with Paul Scharre here.

Please find the report’s introduction below:

The use of automation in the modern workplace has had many consequences, both positive and negative, both intended and unintended. Automation in various forms is increasingly being used in a range of weapons systems such as the Army’s Patriot air and missile defense system. Moreover, it has become commonplace in aircraft flight control systems, and in prototype self-driving cars that have been traversing streets and highways for several years. Applications of automation in future weapons systems and related uses are expected to proliferate and grow in the years to come. Many observers are calling for a candid discussion of appropriate roles for automation in military systems. This is particularly true now that some of these systems are approaching the threshold for autonomous operations.

To some observers, the use of automation in many of the applications cited above is relatively new. These observers write about such developments as if they are recent, and as if we collectively do not have much experience with automation applied to the development of autonomous or near-autonomous systems. That’s not altogether true. Some potential applications of automation technology, like self-driving cars, are relatively new, but other applications, such as near-autonomous air and missile defense systems or extensive flight deck automation in aircraft, have been around for quite some time. Moreover, we have a fair amount of operational experience with existing systems, and that experience has not all been positive. When I read the descriptive literature and claims for some of the newer applications of automation, such as self-driving cars, I find myself wondering whether their proponents either are not aware of our history with these older systems, or view experiences with older systems as irrelevant to their “new” and more advanced uses of this technology. Perhaps the idea is, “We’re better now, and that old stuff doesn’t apply.” It is true that automation technology is getting better, but the latter assertion is not necessarily true. There are lessons and pitfalls associated with the use of automation in older systems that apply directly to what can be expected with newer applications. A number of these lessons concern the human’s residual role in system control, and how difficult that role can be to prepare for and to perform.

This report is a mostly personal story. I have been in the somewhat unusual position of having had a long-term, hands-on association with an early application of automation in weapon system control. The application in question is the Patriot air and missile defense system. The next portion of this paper traces my personal history with Patriot going back more than 35 years. During this time, my views regarding automation and autonomy have evolved considerably, based on extended hands-on experience with that system. I’ll state upfront that I’m not as optimistic regarding the safe and effective use of automated and near-autonomous systems as I once was. In this respect, the paper also outlines a number of lessons and cautions derived from my experiences with Patriot. I think these apply to many of the potential applications of automation technology currently being discussed. They go beyond the technology employed and also apply to the personnel and organizations charged with safely and reliably using that technology. In fact, the technology component may be the easiest of all to address. I have observed firsthand that the human aspects of automation are often the most difficult to resolve.

Scharre is available for interviews. To arrange an interview, please contact Neal Urwitz at nurwitz@cnas.org or 202-457-9409.

  • Neal Urwitz

    Director of External Relations

    Neal Urwitz is the Director of External Relations at the Center for a New American Security (CNAS). At CNAS, Mr. Urwitz is responsible for the organization’s media relations, ...