Armed Drones: a dangerous precursor to Autonomous Weapon Systems

Freya Tulloch
Published in nonviolenceny
Nov 8, 2018

Heather Linebaugh served as an imagery and geospatial analyst for the US drone program during the occupations of Iraq and Afghanistan from 2009 to 2012. In 2013 she wrote an article exposing how the US and the UK overstate the capabilities of drone technology, as well as the psychological trauma that affects drone operators. She describes a time when the video feed was so pixelated that it was almost impossible to tell whether the target was holding a shovel or a weapon [1]. In that instance the drone was human-operated and Linebaugh was still exercising control over the targeted attack, but what will happen when artificial intelligence allows drones to operate entirely out-of-the-loop (with no human operator)?

The problem of autonomous weapons systems…

Emerging means of warfare and new weapons systems were highlighted in the Secretary-General’s Agenda for Disarmament as an issue requiring urgent international attention. He stated that the pace of technological development is challenging the capacity of governments and international regulatory frameworks to respond [2]. Drones have acted as a precursor to the emergence of highly autonomous weapons systems (AWS), and such weapons raise a number of ethical and humanitarian concerns.

Autonomous Weapons Systems in an international framework…

Exponential advances in military technology have created a situation in which technology is developing faster than humans’ ability to learn, adapt and regulate. While only a handful of states have domestic legislation regulating the development of AWS, there is currently no international agreement on their definition, nor a treaty restricting their development. The International Committee of the Red Cross expert meeting on autonomous weapon systems defined AWS as “weapons that can independently select and attack targets” [3]. They possess functions which allow them to identify a target and deploy lethal force without any human supervision [4].

Major Concerns…

This blog will unpack the following major concerns:

  • Lowering the threshold for engagement
  • Reciprocity of risk

Lowering the threshold for engagement…

AWS could lower the threshold for states’ engagement in conflicts by removing soldiers from the battlefield. This would enable politicians to more easily justify conducting missions, on the basis that the loss of an AWS would not cause the same political backlash as the loss of a human life. Polls in the US have indicated that, “as casualties mount, support for war wanes” [5]. For example, an October 2011 Gallup poll indicated that 75% of respondents approved of Obama’s decision to withdraw US troops from Iraq, suggesting that the American public “ha[s] become increasingly averse to any casualties” [6]. AWS signify the “ultimate break” between the public and its military. A combination of “public passivity” towards war and the significantly reduced human cost of engagement will make the decision to engage less onerous and less considered, particularly under the Trump administration [7].

The ‘reciprocal risk’ issue…

Another important consideration when unpacking the ethics of AWS is what effect the shift from a soldier on a battlefield in Afghanistan to an operator in a cubicle in Nevada will have on traditional concepts of warfare and of the soldier. Removing the soldier from the risks of warfare increases the asymmetry between the capabilities of the actors. Law professor Paul Kahn has framed the issue as one of “reciprocal risk” between actors. He argues that as drones have increasingly become the weapon of choice for the US in its pursuit of the Global War on Terror, their use has eroded pre-existing ideas of time and place and traditional notions of the combatant [8]. This is problematic: as these weapons become more advanced, they will have the capability to enter populated urban areas and identify and lethally target individuals. That prospect is deeply worrying in light of international humanitarian law and its principles of distinction, proportionality and accountability, not to mention the inability of the target to surrender, or even to be aware that he or she is in a situation of armed conflict.

This blog has introduced the ethical and humanitarian issues posed by autonomous weapons systems. Future blogs will examine artificial intelligence, the role it is playing in advancing weapons, and the issues this raises. It is time we started addressing the ethical issues associated with these weapons and campaigned for an international agreement regulating their development.

Freya Tulloch, October 4th, 2018

References:

[1] Linebaugh, Heather. “I worked on the US drone program. The public should know what really goes on,” The Guardian, December 29, 2013. <https://www.theguardian.com/commentisfree/2013/dec/29/drones-us-military?CMP=fb_gu>

[2] United Nations Office for Disarmament Affairs. Securing our Common Future: An Agenda for Disarmament. New York: United Nations, 2018.

[3] International Committee of the Red Cross. “Autonomous Weapon Systems: Technical, Military, Legal and Humanitarian Aspects.” Expert meeting report, Geneva, 26–28 March 2014. Published 9 May 2014, 2.

[4] Piccone, Ted. “How can international law regulate autonomous weapons?” Brookings Institution, April 10, 2018. <https://www.brookings.edu/blog/order-from-chaos/2018/04/10/how-can-international-law-regulate-autonomous-weapons/>

[5] Horowitz, Michael C., and Matthew S. Levendusky. “Drafting Support for War: Conscription and Mass Support for Warfare.” The Journal of Politics 73, no. 2 (2011): 525.

[6] “Iraq,” Gallup, last accessed May 16, 2017. <http://www.gallup.com/poll/1633/iraq.aspx>; Marchant et al., “International Governance of Autonomous Military Robots,” 14.

[7] Singer, Peter W. Wired for War: The Robotics Revolution and Conflict in the Twenty-first Century. New York: Penguin Press, 2009, 320.

[8] Kahn, Paul. “Imagining Warfare.” The European Journal of International Law 24, no. 1 (2013): 19.
