People can differ on their perceptions of "evil." People can also change their minds. Still, it's hard to wrap one's head around how Google, famous for its "don't be evil" company motto, dealt with a small Defense Department contract involving artificial intelligence.
Facing a backlash from employees, including an open letter insisting the company "should not be in the business of war," Google in April grandly defended involvement in a project "intended to save lives and save people from having to do highly tedious work."
Less than two months later, chief executive officer Sundar Pichai announced that the contract would not be renewed, writing equally grandly that Google would shun AI applications for "weapons or other technologies whose principal purpose or implementation is to cause or directly facilitate injury to people."
To the surprise of exactly nobody familiar with Silicon Valley's flexible ethics, he was quick to add that Google "will continue our work with governments and the military in many other areas" including cybersecurity, training and military recruitment. Because we all know that military training has nothing whatsoever to do with facilitating injuries to people.
Google's moral posturing aside, the brouhaha over Project Maven does raise a whole lot of important questions over what defense, national security and law-enforcement applications of artificial intelligence will mean for humanity in the near and distant futures. So I decided to pose some of them to somebody who's been giving the whole thing deep thought: Paul Scharre, author of a new book, "Army of None: Autonomous Weapons and the Future of War."
Scharre, a former Army Ranger who deployed to Afghanistan and Iraq, is now the director of the technology and national security program at the Center for a New American Security, a Washington think tank founded by some heavy hitters from the Obama administration's Defense and State Departments. Here is a lightly edited transcript of our discussion:
Tobin Harshaw: Let's start with the specific, then move to the general. Many people know that Google decided not to renew its contract with the Pentagon on Project Maven. Very few probably know what Project Maven is. Can you briefly describe it, and explain how AI -- machine learning -- factors into it?
Paul Scharre: The essence is using artificial intelligence to better process drone imagery so that people can understand it. In the public imagination, drones are often synonymous with drone strikes. For the military, the real value that drones bring to the table is their ability to do persistent surveillance. Most of the time they're doing reconnaissance missions -- just watching -- following people, mapping terrorist networks, and scooping up volumes of data that are very hard for humans to process.
Read the full interview at Bloomberg