Arms Race Dynamics of Lethal Autonomous Weapons Systems

Ellie Harris
Warwick Artificial Intelligence
7 min read · Feb 23, 2022

The development of Lethal Autonomous Weapons Systems (LAWS) opens up a whole new world of capabilities in weaponry. But can these systems genuinely improve the ethics of lethal decisions made by militaries, and what will be their impact on the already spiralling arms race between governments?

Why does the military need LAWS?

Militaries worldwide are increasingly recognising the importance of LAWS because the technology is drastically more efficient than human-operated weaponry. For example, if individually piloted drones were launched for an attack, only a couple could be in the air at once due to the risk of collisions, and the pilots would be the ones calling the shots on when and where to fire, with a high potential for human error. With AI in control, complex pathfinding and decision-making algorithms can make these problems disappear (Piper, 2019).

The use of AI in warfare offers an unparalleled level of accuracy, allowing militaries to limit the severity and range of attacks. This is evidenced by emerging technologies equipped with significantly less firepower than their predecessors; as our source put it, “Nations are using Hellfires to take out solitary vehicles that have a kill radius of 200+ metres. But these LAWS use maybe a kilo or two of explosives”. This means that a target as small as a single car can be struck without risking civilian lives or damaging surrounding infrastructure.

UK in the Arms Race

With the growing popularity of these autonomous weapons, we have entered a new era of warfare. As much as it may seem that the UK wishes to remain detached from these cutting-edge advancements, the nation is taking a discreet interest in the technology. Although the exact details of the UK’s involvement in the development of LAWS are unknown, we can be sure that the work is undertaken in an attempt to remain competitive against increasingly well-equipped adversaries. Our source shed some light on this position: “what keeps you alive — you want. What gives you a vital edge over an enemy — that’s invaluable to us”. However, this stance is a conflicted one: are we accelerating the development of these systems while not fully supporting what they stand for?

Ethical issues surrounding LAWS

The key areas of contention surrounding LAWS can be divided into three: the alignment of these systems, the danger of misuse, and the problem of blame.

Alignment refers to ensuring that an artificial intelligence system works towards its goal in the way that was intended (Christiano, 2018). This matters especially here because the decisions of a misaligned system can mean life or death for innocent civilians caught in the crossfire. Misprocessing of visual data (targets or environmental stimuli) can be lethal, and even something as seemingly straightforward as weighing the value of eliminating a target against the other harm the strike may cause needs to be seriously considered.

Misuse is a key issue because of the terrible consequences if this technology were to fall into the wrong hands. Those hands could belong to people intent on doing extreme and unnecessary harm, but also to people who lack the integral, necessary training and so are unequipped to handle LAWS or even to comprehend the danger of misuse.

The problem of blame is philosophically interesting: who would take the fall if something were to go wrong? Would it be the developers of the weapon, the people who authorised the attack, or, higher up still, the government? Each answer has its own arguments for and against, and no satisfying conclusion has yet been reached, leaving this area of the systems’ legality somewhat lax.

What does the UK think of LAWS?

It is the consideration of these complex ethical problems that, our source claims, is key to the UK’s hesitancy towards LAWS: “We’re not behind. The area that stops us is the moral dilemma of developing these systems, whereas other countries are very happy to go and eliminate civilians if they need to. We are not.”

Debates over whether LAWS should be developed at all are becoming complex, involving not only concerned experts but also, where it matters most, governments. The British government’s interest in this debate dates back to March 2013, when Lord Astor of Hever (Parliamentary Under Secretary of State, Defence; Conservative) stated in the House of Lords: “Fully autonomous systems rely on a certain level of artificial intelligence for making high-level decisions from a very complex environmental input, the result of which might not be fully predictable at a very detailed level. However, let us be absolutely clear that the operation of weapons systems will always be under human control.” (Parliament.uk, 2013).

Despite this, there has been little discussion of how to achieve these aims, and most such statements concern fully autonomous weapons, which are not yet widely deployed, rather than the types of LAWS we are seeing today. Article 36, a United Kingdom-based NGO, calls for a clearer national policy on LAWS. It believes the UK should aim to be at the forefront of discussions around these weapons, developing these ideas internationally and objecting to the use of lethal autonomous weapons systems. The NGO also proposes that the nation support negotiations on international law to prohibit the technology, or at least strongly regulate its use, and that the UK specifically detail how it will ensure that current and developing systems remain under human control. Further, Article 36 argues for an assessment of the effects of ensuring meaningful human control, as well as an inquiry into how the UK would match up against other countries and how that control would affect each individual attack.

LAWS around the world

Many other countries are showing less restraint in their involvement with lethal autonomous weapons. China is just one example, with its ‘Next Generation Artificial Intelligence Development Plan’ detailing plans to utilise AI on the battlefield. The United States is also adopting autonomous technology in warfare, with developments such as the Sea Hunter, an unmanned submarine-tracking vessel capable of detecting mines and other submersibles (Klare, 2019). The US claims not to have any LAWS in its inventory, but this is unlikely to remain the case for long: US military leaders have stated that, as LAWS become more widely fielded by other militaries, they may be compelled to develop the technology, though only on the condition that it remain under human supervision when making decisions (Sayler and Congressional Research SVC, 2020).

Russia is an avid supporter of lethal autonomous weapons in their entirety, with Putin declaring that the country dominating this technology will ‘become the ruler of the world’ (Haner and Garcia, 2019). Despite this interest, Russia has invested very little in the weapons and reportedly holds a low supply. Another supporter of fully autonomous weapons systems is South Korea, a world leader in sentry weapons through its deployment of the Samsung SGR-A1, a turret capable of surveillance, tracking, firing and even voice recognition. The European Union is divided in its opinion, with a combined total of $8 billion devoted to such weapons across its member states. Many EU countries are unwilling to develop these lethal weapons, so the group’s main focus is not on LAWS but on AI in industrial robotics, which could be seen as one positive alternative, for now.

Our interviewee informed us that most of the testing of these systems is taking place in Libya: “[Libya] is basically being used as a playground to test weapons systems”. This is due to the complete breakdown of the Libyan regulatory and legal systems, which has resulted in an almost total free-for-all among western military forces. Although such testing is important to ensure that LAWS work effectively and safely, it is largely unregulated and undocumented. This issue, if and how we are going to regulate these systems, is beginning to rear its head in the media and is the focus of the next essay in this series.

References/Resources

Article 36 (2016). The United Kingdom and lethal autonomous weapons systems. [online] Available at: https://article36.org/wp-content/uploads/2016/04/UK-and-LAWS.pdf.

Christiano, P. (2018). Clarifying “AI alignment” — AI Alignment. [online] Medium. Available at: https://ai-alignment.com/clarifying-ai-alignment-cec47cd69dd6.

Haner, J. and Garcia, D. (2019). The Artificial Intelligence Arms Race: Trends and World Leaders in Autonomous Weapons Development. Global Policy, [online] 10(3), pp.331–337. Available at: https://onlinelibrary.wiley.com/doi/pdf/10.1111/1758-5899.12713.

Klare, M.T. (2019). Autonomous Weapons Systems and the Laws of War | Arms Control Association. [online] Arms Control Association. Available at: https://www.armscontrol.org/act/2019-03/features/autonomous-weapons-systems-laws-war.

Parliament.uk (2013). Lords Hansard text for 26 Mar 2013 (pt 0001). [online] Available at: https://publications.parliament.uk/pa/ld201213/ldhansrd/text/130326-0001.htm#st_14.

Piper, K. (2019). Death by algorithm: the age of killer robots is closer than you think. [online] Vox. Available at: https://www.vox.com/2019/6/21/18691459/killer-robots-lethal-autonomous-weapons-ai-war.

Sayler, K.M. and Congressional Research SVC (2020). Defense Primer: U.S. Policy on Lethal Autonomous Weapon Systems. [online] DTIC. Available at: https://apps.dtic.mil/sti/citations/AD1121848.

Turner, J. (2018). Sea Hunter: inside the US Navy’s autonomous submarine tracking vessel. [online] Naval Technology. Available at: https://www.naval-technology.com/features/sea-hunter-inside-us-navys-autonomous-submarine-tracking-vessel/.
