Lethal Autonomous Weapons Systems and Human Rights

This is a draft of an abstract for Research in Human Rights.


--

Reports by Human Rights Watch, widely considered a credible source on human rights issues, typically urge countries to correct their own rights violations. A recent report, however, calls for a preemptive ban on "killer robots," or lethal autonomous weapons systems (LAWS). The text "No country is safe" accompanies an animation of the White House being bombarded by drones, and the website depicts a child holding a teddy bear as the sky turns red and drones fly toward him. Human Rights Watch states, "As machines, they would lack the inherently human characteristics such as compassion that are necessary to make complex ethical choices." This paper begins by interrogating the belief underlying this claim: that violence greenlit by humans is permissible, while violence lacking human control is not, because machines are amoral. This view assumes a neutral and autonomous liberal subject, yet underlying that presumption is a racialized and gendered history of technoliberalism that defines some humans as moral in opposition to an amoral or apathetic other. The paper further explores the distinction between "humane" and "inhumane" violence that marks some killings as permissible. Given the history of "humanity" as something that can be granted or taken away, the call to ban lethal autonomous weapons systems reifies what Neda Atanasoski and Kalinda Vora call the "surrogate humanity" effect, wherein the nature of human rights as based in empathy is a…

--