Book review: Wired for War.

Wired for War: The Robotics Revolution and Conflict in the 21st Century
by P.W. Singer
New York: Penguin, 2009

For some reason, humans collectively seem to have a hard time seeing around corners to anticipate the shape our future will take. Those of us who remember email as a newish thing, I suspect, mostly had no idea how much of our waking lives would come to be consumed by it. And surely I am not the only one who attended a lab meeting in which a visiting scholar mentioned a speculative project to build something called the World Wide Web and wondered aloud whether anything would come of it.
In the realm of foreign conflicts, our shared expectations also seem to land some distance from reality, as missions that are declared “accomplished” (or all but) stretch on with no clear end in sight.
In Wired for War, Brookings Institution senior fellow P.W. Singer asks us to try to peer around some important corners to anticipate the future of robotics in our conflicts and in our lives more broadly. Failing to do so, he warns, may have significant consequences for policy, law, ethics, and our understanding of ourselves and our relation to our fellow humans.


Singer starts with where robots are now, considering the ways the U.S. military currently uses robots to clear improvised explosive devices (IEDs) and to conduct surveillance in Iraq and Afghanistan. He talks to the scientists and engineers at the companies developing these robots (including iRobot, maker of the Roomba and Scooba cleaning robots), conveying a palpable sense of the excitement that comes from solving technical challenges and creating robots that can perform new tasks efficiently. At the same time, Singer conveys the moments of disconnection between happy-fun engineering work and the serious, even deadly, uses to which the products of that work are put. It is exhilarating to figure out how to do something you couldn't do before, and it's understandable that this exhilaration is not always in contact with the question of how the new capability should or will be used.
The gap is important because robots do not always change things like war in the ways we expect them to. While robots clearing IEDs may save the lives or limbs of the humans who would otherwise be doing that work, they also shift us into a mode where one side in a conflict is perceived as taking on more risk (and thus displaying more bravery and commitment to the cause) than the other. Although unmanned drones that can be piloted remotely, from Las Vegas rather than Baghdad or Kabul, may reduce the costs and logistical challenges of deploying more personnel overseas, they may also unravel the cohesion between the troops piloting the drones and the troops taking fire on the ground, not to mention exposing the personnel controlling the drones from the safety of Las Vegas to the twin pressures of working a 12-hour shift at war and then going home to a family whose daily demands assume you're not really at war. The use of lightning-fast communications technologies ends up complicating decision-making and subverting command hierarchies. And the cost of the robotic units, along with their potential value to an opponent who might duplicate or repurpose them, leads the troops whose lives the robots are supposed to make safer to risk themselves to protect or retrieve the machines.
How we use robots and the associated technologies doesn't just fit into our pre-existing plans and goals. It shapes those plans and goals, in part as a response to our sense of what the new technologies make possible, feasible, and desirable. But where we end up may depend quite a lot on how many steps down the causal chain we're willing to look as we integrate new technologies into plans that we regard as more or less stable.
After all, it is not just our plans that will end up changing, but ourselves.
Interspersed with interviews with engineers, military experts, troops of all ranks, and policy wonks are visions of our robotic future from literature, film, and TV. The views Singer presents from science fiction writers and futurists range from optimism that robots will transform our world into something immeasurably better to certainty that we should prepare ourselves to serve (or battle) our robot overlords. The extent to which science fiction writers and futurists are taken seriously within the military, even as it embraces greater use of autonomous machines, may surprise the reader. Perhaps, though, it should surprise us more that the rest of society is not taking real steps to think through the shape our robotic future could take.
I myself will not be up nights worrying about the robots' conquest of humanity. The need for power, regular maintenance, and an operating system that doesn't crash means that the 'bots will face an uphill battle (and will likely be reduced to slugging it out with the zombies). However, Singer's observations on the global demographic shifts that make conflict ever more likely do give me the willies. Technological advances can end up amplifying disparities between the rich and the poor, between the consumers and those laboring (or having their natural resources taken on the cheap), while also letting the have-nots get intimate experience of what they do not have via the internet and other means of mass communication. To think that we can easily use technology to neutralize the resulting unrest is the kind of hubris that feels to me like the beginning, not the end, of an apocalyptic movie.
Ultimately, Singer's book urges us to look at the connection between gee-whiz technologies and human aspirations. Robots cannot solve all the problems inherent in human interaction (whether with the world or with each other), and to the extent that we treat them as if they could, we manage to create new problems to untangle. How do robots fit into the existing international laws of war? What counts as a proper battlefield when weaponry and surveillance are controlled remotely? Who bears responsibility for the harm robots inflict on people and property, whether through proper functioning or malfunction: the people controlling them, the people who made the hardware, the people who made the software, or all three?
Indeed, this book raises larger questions about what it is about the human animal that leads us to direct so much of our intelligence, ingenuity, and effort towards fighting each other. Perhaps engaging with the questions Singer raises before the robotics revolution has fully come to pass will lead us to examine whether this "fact" of human nature is as immutable as it seems.


3 Comments

  1. Two themes in the book that resonated with me were (1) Kurzweil's forecast of continued exponential increases in the rate of technological advances, and (2) evidence of the seminal change in the attitude of the military command structure toward automated warfare in the past 5-8 years. I'm in the industry, though mainly with educational and research robotics that aren't necessarily focused on military applications, so I see the technology, but the significance of the exponential factor had not really registered before.
    As a footnote, two of the three founders of iRobot (Greiner and Brooks) have left the company in the past 6 months. Not sure what that means, but it’s an interesting data point.

  2. A friend flies RPVs in Iraq. He is actually stationed in Iraq, and has some interaction with other soldiers on the ground. I suspect that the nature of warfare is changing so that winning the hearts and minds of the people is increasingly important. Not sure robots can do that.

  3. The problem with any book about the future is that it’s bound to be wrong. Except, maybe, in a few cases where someone accidentally gets something right.
    The idea that anyone might take what a science fiction writer thinks seriously is seriously disturbing. There are a few who actually know something about science, but they are rare enough that you could probably count them on the fingers of one hand. The rest of them believe what they write.
    As for how robotics will change war, I suggest that history provides a good lesson. Take the machine gun, for instance. Take the tank. Take the railroad. Or the airplane. Or the aircraft carrier. They all do the same thing: make war more deadly. The supposed precision of our precision weapons just shifts the burden a little more heavily onto the guy who pulls the trigger (or launches the autonomous vehicle). It puts the man behind the gun even further from the action, but that's a process that has been going on for decades. Large-scale aerial bombing in WW II presented exactly the same ethical issues. The men who flew the bombers never even saw the people they killed, although they often died in large numbers themselves.
