Why AI cannot prevent dehumanization in drone warfare

Autumn Perkey

International Affairs
International Affairs Blog
4 min read · Aug 5, 2021


A Predator drone on display, 10 March 2010. Photo: Doctress Neutopia via Flickr

Debates about the role of artificial intelligence in drone warfare have taken on a new urgency of late. Since the first reported battlefield killing by an autonomous drone in Libya in May 2020, questions about the violent impacts of AI on soldiers and civilians alike are no longer hypothetical. It is therefore vital to examine the ongoing automation of warfare and its potential future effects on those living through it.

While unmanned aerial vehicles (UAVs) and artificial intelligence (AI) may decrease the costs of conflict, they also signal a change in the nature of dehumanization in warfare, one which brings moral and psychological costs of its own. UAVs reduce the risk of military fatalities and are more accurate than ground strikes. However, the decisions made in targeting and evaluating UAV strikes also raise important questions about humanity in war. Seeing someone through a scope from miles away, it turns out, is not the same as seeing them face to face.

The recent history of drones and dehumanization

To properly understand the likely impacts of increasing use of AI in drone warfare, it is crucial to understand the contemporary impacts of drone warfare on combatants, civilians and the often-opaque distinctions between the two. US drone policy under the Obama administration defined 'combatant' to include any combat-age male in theatre, regardless of whether he was actually engaged in hostilities. The drive to achieve military objectives has historically resulted in civilian collateral damage, and responsibility for inflicting it often falls to drone operators. Examples of such collateral damage include a 2015 strike in Khost which resulted in 14 civilian deaths and an accidental strike that hit Javari village in Afghanistan, causing a further 30 deaths. In this context, the phenomenon of dehumanization has become increasingly prevalent.

Dehumanization refers to attributing alien or subhuman characteristics to others in order to justify heinous acts, such as the targeting of civilians and the use of torture. For drone operators, each mission hinges on weighing up the value of a human life, a choice increasingly placed on lower-level operators. To get through the day, drone operators are almost incentivized to dehumanize their targets, yet little research has been done on this effect. This is not to say that dehumanization is right, but that it functions much like the dissociation that trauma survivors often rely on to make it through the horrific events they have suffered. It is an ugly necessity of the job, compared by some to participating in prolonged torture. One way to reduce the emotional and psychological burden placed on drone operators could be to incorporate AI into the target selection process. But if human operators have struggled to distinguish civilians from combatants, can we expect better from AI?

Is AI a solution?

Incorporating AI into UAVs offers a potential reduction in the negative effects and trauma experienced by military personnel. It could be argued that removing the effects of trauma might result in more attacks on civilians, and that the positive effects are limited to reducing harm to military forces. While using AI is unlikely to eliminate the risk to civilians in combat zones, it could reduce the burden on military personnel by removing the dehumanization that comes from making decisions at a distance. Yet only the dehumanization experienced by the drone operator would be removed; dehumanization on the battlefield would still exist.

The question the future will answer is whether the algorithms AI depends on will be adequate to make this determination. There are discernible differences between a Humvee and a school bus, but even Apple has not managed to solve the facial recognition bias problem. AI systems often struggle to distinguish individual facial features, and this failure potentially leads to further structural discrimination via facial biases. One study found that facial identification algorithms had error rates of around 1 per cent for white men but up to 35 per cent for women of colour. The failure of AI to accurately distinguish between demographic characteristics can lead to the improper targeting of civilians who are labeled as combatants due to false positives created by this facial recognition bias. Indeed, when incorporating AI into drone warfare there is no way to fully know the risks until such technologies are regularly deployed in the field. Human operators have been shown to be prone to errors that cause collateral damage, and the same is likely to occur with the incorporation of AI.

The experiences of the Obama and Trump administrations in using drones as part of a counterterrorism strategy hold important warnings for the Biden administration. Drones have raised key questions: what constitutes a valuable target, who should be responsible for distinguishing between military targets and noncombatants, how that distinction should be made, and what level of civilian casualties or harm is acceptable in order to achieve an objective. Artificial intelligence may help operators answer some of these questions, but it is unlikely to prevent the dehumanizing tendencies of drone systems.

Autumn Perkey is a doctoral student at the University of Maryland, College Park.

This blogpost is part of a collaboration between International Affairs and the Future Strategy Forum (FSF). FSF is an organization and annual conference series that seeks to elevate women’s expertise in national security, build mentorship, and connect graduate students to policymakers.

All views expressed are individual not institutional.
