Army of None: A Review
Scharre, Paul (2018), Army of None: Autonomous Weapons and the Future of War, 1st edition, W. W. Norton & Company, New York.
Well-researched and written in clear and lucid prose, Army of None presents a wealth of intriguing detail on the past, present and future of war. The technology-intensive subject of AI and robotics as they apply to autonomous weapons is accurately covered in accessible language.
The book has six parts.
Part I, Robopocalypse Now, discusses the weaponization of swarm robotics and the notion of autonomy. Fundamental concepts commonly used in discussions about autonomous weapons, such as human-in-the-loop, human-on-the-loop and human-off-the-loop systems, are explained.
Part II, Building the Terminator, goes deeper into which autonomous weapons have been built historically and which are under active development now.
Part III, Runaway Gun, describes the kinds of errors that are associated with autonomous systems and, in particular, the vulnerabilities certain kinds of AI have to image spoofing. It also highlights an as yet unanswered question: how can you test a “learning” system?
Part IV, Flash War, focuses on “the need for speed” with reference to financial trading systems. It explores the risks of “machine speed” in finance (“flash crashes”) and in war. Unlike in stock trading, where exchanges can impose a timeout suspending trade when prices move too abruptly as a result of algorithmic mayhem, in battle there are no timeouts.
Part V, The Fight to Ban Autonomous Weapons, discusses efforts to ban autonomous weapons and the moral arguments against them.
Part VI, Averting Armageddon: The Weapon of Policy, introduces concepts such as “centaur warfighters” (human-robot teams) and offers an informative survey of the mixed history of arms control. It poses the question: are autonomous weapons inevitable?
Scharre argues for restraint but not a ban. Restraint, “the conscious choice to pull back from weapons that are too dangerous, too inhumane,” he says, “is what is needed today.” He argues that pieces of paper will not stop states building autonomous weapons if they really want to but that “a pell-mell race forward in autonomy, with no sense of where it leads us, benefits no one.”
“States,” he concludes, “must come together to develop an understanding of which uses of autonomy are appropriate and which go too far and surrender human judgement where it is needed in war.”
Perhaps the most impressive aspect of the book is the range of sources who have given Scharre on-record interviews. These include senior Pentagon figures such as former Deputy Secretary of Defense Bob Work and former Undersecretary of Defense Frank Kendall, DARPA directors such as Bradley Tousley, Aegis commanders such as Captain Pete Galluch, and academics such as Dr. John Hawley, an engineering psychologist who advises the US Navy on achieving high-reliability operation of Aegis. These interviews provide great insight into current thinking on complex autonomous systems based on real-world experience. For me the most eye-opening parts of the book were the sections covering Aegis and Patriot.
Another attractive feature of the book is Scharre’s ability to link his own military experience as a US Army Ranger in Afghanistan to the broader issues of moral responsibility in warfare.
Obviously Scharre is sympathetic to the Pentagon point of view, but he gives those seeking to ban autonomous weapons a fair hearing. Figures such as Steve Goose and Bonnie Docherty of Human Rights Watch; Australian philosopher Rob Sparrow, a founding member of the International Committee for Robot Arms Control; and Jody Williams, a co-founder of the Campaign to Stop Killer Robots, who shared a Nobel Peace Prize for her role in winning public and diplomatic support for the Ottawa Convention banning anti-personnel landmines, are quoted extensively.
He covers the three sides of the ethical, legal and policy arguments on autonomous weapons clearly. There are those who favour retaining existing IHL and say there is no need for any regulation specific to autonomous weapons (e.g. the UK). Others favour a ban on autonomous weapons (e.g. Brazil, China, Austria and the Holy See). Still others favour some form of regulation specific to autonomous weapons.
Scharre covers the “mixed history of arms control” and applies this to the current debates on autonomous weapons that are ongoing at the United Nations in Geneva.
Historically, the success of a ban relies on three factors: perceived horribleness, perceived military utility, and the number of cooperating actors required. Blinding lasers were relatively easy to ban, having a high “ick” factor and limited military utility. Attempts to ban submarines and bombers between the World Wars, however, failed due to the high utility of these weapons. Looking to the future, no one seriously disputes the very high potential military utility of autonomous weapons.
He expresses some concern that the campaign to ban autonomous weapons is being led by NGOs rather than by great powers. He is unimpressed, by and large, with those nations that have signed up for a ban. “What the countries who support a ban have in common is that they are not major military powers. … for most of these countries their support for a ban isn’t about protecting civilians, it’s an attempt to tie the hands of more powerful nations.”
On this point it should be noted that Army of None went to print before China declared its support for some kind of ban on April 13th, becoming the first of the five permanent members of the Security Council to do so. Also, one could argue the Ottawa Convention banning anti-personnel landmines was led by NGOs appealing directly to public opinion, not by great powers.
Scharre remains sceptical that a ban on autonomous weapons will be agreed to in the short term. Nations are still struggling to agree on a “common lexicon” to describe autonomous weapons, he thinks. However, recent events have shown there is a general willingness to define such a lexicon. The recent AI report from the House of Lords, for example, admonished the UK Ministry of Defence for its very “non-standard” definition of an autonomous weapon. It so happens China’s definition is somewhat “non-standard” too. Even so, the NGOs were happy enough with China’s position to put them on their tally list as supporting a ban on autonomous weapons.
As Scharre makes very clear, “when the starting point for definitions is that some groups are calling for a ban on autonomous weapons then the definition of autonomous weapons instantly becomes fraught.”
Scharre does not support a complete ban on autonomous weapons. Nor does he support an ‘anything goes’ approach. He is firmly committed to the view that there are some decisions in war that should continue to be made by humans, but realistic about the military imperatives driving nations towards autonomous weapons. So he thinks a line needs to be drawn between acceptable and unacceptable uses of autonomy. Much of the book explores the detail of autonomy and where and why this line might be drawn.
Army of None is a must-read for those in the armed services, defence analysts and policy makers. It is detailed without being dense and accessible without being simplistic. There is very little to criticize. My only complaint is that Scharre is a little dismissive of the ability of AI to make moral decisions (disclosure: my own research is about AI making moral decisions), but this is a minor criticism of what, overall, is a timely, fascinating and worthwhile book.
Sean Welsh (@sean_welsh77) is the author of Ethics and Security Automata: Policy and Technical Challenges of the Robotic Use of Force and a postgraduate student in Philosophy at the University of Canterbury. Prior to embarking on his PhD he wrote software for British Telecom, Telstra Australia, Fitch Ratings, James Cook University and Lumata.
This article was first published in the Williams Foundation's Central Blue column on May 23, 2018 and is reprinted with their permission.