Georgia Tech professor Ronald Arkin contends that we might soon see the advent of autonomous drones operated by algorithmic ‘ethical governors’, replacing human decision-making in warfare. As Peter W. Singer argues in his seminal book Wired for War, this prospect does not belong to the realm of science fiction: we are in the midst of a revolution in military affairs, with digital and robotic technology increasingly replacing human decision-making in contemporary warfare.
Singer therefore argues for a paradigm shift not only in our understanding of strategy, but also in ethics. In political practice, this shift has not occurred. In the Brennan confirmation hearings, the ethics of drones became the subject of debate, but the debate tended to remain within the confines of practical ethics, emphasising questions about legal permissibility, rules of engagement, and so on. However, if we take Arkin and Singer seriously and accept the idea that autonomously acting drones might soon become a reality (or a real possibility), we begin to see that this prospect presents us not with a practical problem but with a fundamental challenge to the domain of ethics itself. This is because an autonomous ‘ethical governor’ would lack judgment, the constituent category underlying the ethical order.
Drones and the (supposed) separation of action from consequence
When analysing the current generation of drones, what first strikes us is the extent to which they geographically separate action from consequence: the pilot acts in Nevada, the consequences occur in Pakistan. Obviously, to a certain extent, this separation underlies all mechanisation of warfare, which inserts ‘the mechanical’ between the combatant and his adversary. However, the advent of digital communication has drastically widened this gap. Interestingly, though, this has not necessarily led to a further distancing of the actor from the act: the drone pilot is simultaneously further removed from, yet closer to, his target than previous generations of pilots. Despite all the talk of the virtualisation or hyper-reality of contemporary warfare since Desert Storm (Baudrillard), it is precisely the safety afforded by physical distance that allows the drone pilot to remain very close to his subject.
This is echoed in interviews with these pilots, who often describe a sense of (unilateral!) proximity to their subjects. Pilots frequently follow their targets for hours or days, watching them engage with friends and family as well as with the less ‘familial’ affairs for which they are presumably targeted. In other words, although for the drone pilot the physical nexus between action and consequence is all but severed, this is not the case for the deeper nexus between action and judgment. It would therefore be mistaken to characterise the judgment of the drone pilot as ‘virtual’, precisely because the consequences of his act remain inextricably linked to it.
The ‘ethical governor’ and the separation of act from judgment
The possible advent of autonomous drones operated by ‘ethical governors’, however, would entail precisely such a separation of action from judgment. The technical details need not concern us much here, since ultimately the conceptual problem remains the same: a genuinely autonomous ‘ethical governor’ means that algorithm-generated decisions replace human judgment in the operation of the drone. To put it differently, in a genuinely autonomous drone, the decision to ‘push the red button’ is no longer a human one but the outcome of an algorithmic calculation yielding a particular result. Again, this is hardly science fiction: in computational finance, algorithms have already partially substituted for human decision (and judgment) in the activity of trading.
Indeed, this is what gives the concept of an autonomous drone operated by an ‘ethical governor’ its appeal: it would replace the fallibility of human judgment with the unsentimental, efficient rationality of technology. As Arkin suggests, what makes the concept of an ‘ethical governor’ attractive is the possibility of programming it to comply with the rules of international and humanitarian law. This would render its decisions not only more accurate but also in accordance with the ethics of international law, bringing ever closer the utopian ideal of efficient, clean and ethical warfare.
But, I suggest, this reasoning rests on a fundamental error. What is at stake is not a question about the relative ethical efficacy of such an ‘ethical governor’ but whether it could be said to act ‘ethically’ at all. The answer must be negative, for such an ‘ethical governor’ would lack precisely the faculty through which the ethical order first emerges: that of judgment.
The burdens of judgment
My argument here is that ethics is premised on the fundamental possibility of judgment. Judgment is not merely a product or outcome of ethics, nor simply compliance with a rule; it constitutes ethics’ foundation. Judgment is the particular capacity to discriminate between meaningful alternatives. In ethics, it means the ability to discriminate between alternative courses of action and simultaneously to bear responsibility for both that judgment and its consequences. This is what gives all genuinely ethical problems their weight. The weight of ethical judgment arises not only from the possibility of error, but equally from the responsibility for a correct judgment. In this sense, even the very concept of callous or erroneous judgment refers back to this underlying connection between responsibility and judgment.
In the case of the autonomous drone, the ultimate decision would be one of life or death. One could program its ‘ethical governor’ with a rule that ‘thou shalt only kill if x, y, z’, stipulating all relevant clauses of international and humanitarian law. Consequently, we might say that its actions would accord with the ethics of international law. But here we locate the dangerous logic underlying the very concept of an ‘ethical governor’: the assumption that because its actions would be in accordance with the law, they would thereby somehow also become ‘legal’. What gives legal or ethical actions their significance is never merely their accordance with a rule, but the fact that they follow from an agent’s conscious judgment to act accordingly. Thus, although nominally both stones and drones might act ‘in accordance with the law’ (by not killing, for example), only human judgment confers on an action any legal or ethical significance. Consequently, any action of an ‘ethical governor’ would strangely be in accordance with, yet remain strictly ‘parallel’ to, the law, because it would fail to uphold the law’s legality. Moreover, its acts might be traced back to mechanical and algorithmic causes, but they would not refer to any meaningful judgment; nor would they generate responsibility. (The suggestion that judgment might be mimicked algorithmically would make a travesty of the entire concept.)
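To make the conceptual point concrete, the kind of rule-compliance at issue can be sketched in a few lines of code. This is a purely illustrative, hypothetical sketch; the rule names and data fields are invented for this example and are not drawn from Arkin’s actual system or any real targeting software. What it shows is precisely the author’s claim: such a ‘decision’ is an algorithmic query over data points about the target, and nothing in it weighs alternatives or bears responsibility.

```python
# Hypothetical sketch of a rule-based 'ethical governor'. All rules and
# field names are invented for illustration; this is not a real system.

RULES = [
    ("target is a verified combatant", lambda s: s["combatant_verified"]),
    ("no civilians within blast radius", lambda s: s["civilians_nearby"] == 0),
    ("engagement area is not protected", lambda s: not s["protected_site"]),
]

def governor_decision(situation):
    """Return 'ENGAGE' only if every stipulated clause holds.

    Note what is absent: the function checks inputs against rules,
    but it neither deliberates between alternatives nor bears any
    responsibility for the outcome.
    """
    for description, clause in RULES:
        if not clause(situation):
            return ("HOLD", description)
    return ("ENGAGE", None)

# The 'decision' is just a query over data points about the target.
verdict = governor_decision(
    {"combatant_verified": True, "civilians_nearby": 2, "protected_site": False}
)
# verdict is ("HOLD", "no civilians within blast radius")
```

However elaborate the clauses become, the structure stays the same: a lookup that yields a result, which is exactly why accordance with the rules here never amounts to judgment.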
Towards a right to be killed by human hands?
One might argue that if the outcome is superior to human decision, why bother? Alternatively, one might suggest that ultimate responsibility for the ‘ethical governor’ still lies with its owners. I suggest that either response fails to capture the importance of judgment for ethics. Outcomes matter ethically precisely because they refer to the concept of responsibility, and responsibility only becomes possible with judgment. Moreover, the engineers of an ‘ethical governor’ would presumably bear some responsibility; but for what, exactly? Would they not merely provide the algorithmic possibilities for action, rather than pass actual judgment in the relevant circumstances? Quite literally, in this minor shift, the adversary has become a data point rather than a subject ‘worthy’ of judgment. His death would be the consequence of an algorithmic query yielding a particular result, rather than of a judgment, however terrible.
Thus, the imminent advent of autonomous drones operated by ‘ethical governors’ would not solve any ethical problem; it would dissolve a constitutive component of ethics itself: judgment. We should not be surprised if, in the near future, the ‘right’ to be killed by human hands perversely becomes a legitimate subject of debate.