Last week, the Federal Aviation Administration announced that a slew of applicants were in the running to host six new domestic test sites for unmanned aircraft, the surveillance and assault machines better known as drones. Applicants came from 37 states, including South Carolina, and speculation began almost immediately about who the applicants were and how the test sites could boost the economies of host states.
But the City Paper took the news as a chance to step back and consider the ethics of drone use. The UK-based Bureau of Investigative Journalism estimates that as many as 4,700 people have been killed by U.S. drone strikes so far in Pakistan, Yemen, and Somalia, and here in the Land of the Free, police departments have been using spy drones since the first Predator drone-assisted arrest took place in North Dakota in December 2011.
Conspiracy-theorizing aside, how should we feel about our country's role in the expansion of drone technology? For an ethical treatment, we turned to D. Keith Shurtleff, a former Army chaplain who served in Afghanistan and spent part of his career at Fort Jackson in Columbia. Back in 2002, Shurtleff published an essay titled "The Effects of Technology on Our Humanity" that focused largely on smart bombs but served as an almost prescient primer on the moral pitfalls of drones. The thesis of his essay was that "as war becomes safer and easier, as soldiers are removed from the horrors of war and see the enemy not as humans but as blips on a screen, there is a very real danger of losing the deterrent that such horrors provide." Author P.W. Singer quoted the essay in his bestselling 2009 book Wired for War: The Robotics Revolution and Conflict in the 21st Century, and it even gets a citation on the Wikipedia page for "Unmanned combat air vehicle." Shurtleff is neither a Luddite nor a pacifist, but he does see cause for concern. We tracked him down in northeast Kansas, where he is retired from the military and working a civilian job.
CITY PAPER: I'll play the devil's advocate: Doesn't detachment from the emotional toll of killing make for a better soldier?
D. Keith Shurtleff: I don't think so. From my perspective, if you think a better soldier is a killing machine — and I'm sure some do, like, "I'll be more proficient, more lethal because I'm not shackled by morality" — to me, that is not a good soldier. That's a killing machine, and that's not something that you want to unleash. That's why you always want to preserve the human side ... And that's why, as I think about future systems, if they become totally AI, artificial intelligence-driven, there's no morality. You can't infuse morality into a system like that. Once war becomes so clean and so efficient and morality doesn't play into it, there's nothing to stop it. For my idea, the ideal soldier never was that. That's why I suggest in my paper, if the guys do the Highway of Death [a strip of road where U.S. forces bombed retreating Iraqis during the Gulf War, with one pilot remarking that it was "like shooting fish in a barrel"], they need to clean it up. They need to pull the wallet from the dead guy and see his family, and it needs to be horrible like that so that they realize. I realize other people disagree with me, but my theory was, you can still be a good soldier because you're fighting for a higher principle. You're fighting for the guy next to you, you're fighting for duty, you're fighting for love of country. And most of the soldiers I knew were fighting for their families back home, because they honestly believed deep in their heart that what they were doing was preparing a better world or a safer world for their children or the people that they cared about. And you never want to take that out of the soldier, from my perspective. I believe soldiers should be ones who kill for duty, who kill reluctantly ... It should bother them. I guess my theory was that killing should always be a hard and painful thing.
CP: In your paper, you suggested something I'd never heard suggested before. You wrote that "artillery crewmen could be asked to do a rotation at a POW camp or a refugee camp. Pilots who bomb cities could, after the cities are secure, be required to return to the city and work to rebuild some part of it." What sort of reaction did you get from people within the military after you wrote that?
DKS: Some understood the deeper principle that, yeah, we want to keep humanity, but I heard from others who honestly believed that would restore what they thought the conditioning had removed, and it would make the soldiers less effective or less willing to act ... What I used to teach the soldiers in ethics, obviously there's the moral and, for me, the deeper religious reasons why we should maintain our humanity, but also there's what's called winning the hearts and minds, which is a strategy. And so I would try, in case my soldiers didn't really buy the humanity part, I'd say, "So now I'll just show you how it can also be effective to maintain the hearts and minds. You still can't be like a killing machine. You still have to exercise compassion" ... I found out teaching ethics that we're all at different ethical levels. Some people are just, "Well, what's in it for me?" And if that's their level, then you have to find some reason other than duty or honor or love, which I always argued were the higher motivations.
CP: Is there an equivalent to your suggestions for military personnel that could be applied to police forces using drones domestically?
DKS: Police forces, I could see that, because they would be under authority, and you could make it part of their training. I'm not sure if police forces do ethical training, which opens the whole other issue that there's an ongoing debate about whether you can even train ethics or not ... So I believe ethics training can have an impact. It's probably not the cure-all, though.
CP: Should we be pushing back against the expansion of drone usage, abroad and domestically, or should we accept this as an inevitability and focus on how to use those tools ethically?
DKS: I know stuff is going to go forward and capitalism and money-making is going to drive the train, but at least someone should stand up and say, "Hey, you guys can do this, but maybe you should pause and consider what you're going to lose." ... I was talking to my son about it, and I was explaining to him about the old book by George Orwell, 1984, and I couldn't remember, but as I was talking to him, I think it was television, everyone was addicted to television, and that's how [the government] would spy on you. It's been a long time since I read it, but I told him, what if drones are like that? What if they're the new television? What if they become so invasive that the government knows everything you're doing at every moment? ... So, if drones become so prolific, how can you control it? Do we really want to give up the privacy? Do we really want to never know if someone's up there looking down, spying on you? And that goes back to the search and seizure issues and all those other things. There should be some kind of ethical control on it. In the military, it's a little bit easier, and that's why I made recommendations. For the government, maybe if the government is considering that, they should actually consider: What do we lose on this, and how can we regulate it?