Changing the LAW(S) of War: The Normative Implications of Autonomous Weapons Systems

By James Fleming, Queen’s University


Lethal Autonomous Weapons Systems (LAWS) sit at the forefront of military technology, and the full power of these systems has yet to be realized. The question remains: will we ever realize their full military potential? There is no straightforward answer. Nations looking to upgrade their existing arsenals and integrate LAWS into all three levels of war face many considerations, including the removal of the ‘human factor’ from warfare, the legality of using LAWS to conduct war, and the normative implications of deploying “killer robots” onto the battlefield. At present, very few international laws govern the deployment of LAWS on the battlefield, and only a select few nation-states are actively developing and fielding these systems. The incentive to develop LAWS is high, as deployment may create major power imbalances, tipping the scales in favour of countries with high levels of technological advancement. Autonomous technology has the potential to reshape the fundamental understanding of war and to change, forever, how wars are fought.

Machines are already beginning to replace humans on the battlefield, creating distance between the soldier and the target, as evidenced by the deployment of the U.S. Reaper and Predator drones. What differentiates these unmanned aerial vehicles (UAVs) from LAWS is the addition of state-of-the-art artificial intelligence (AI), which allows LAWS to identify, select, and kill human targets without human intervention. Although LAWS have not been formally introduced, the technology is well beyond the concept stage. A 2021 UN report suggested that members of the Libyan National Army were hunted down and killed by a Turkish-made Kargu-2. The report was met with sensationalism in the media, with comparisons drawn to the Terminator suggesting that LAWS, and the Kargu-2 in question, can operate with human-like intelligence. Military AI is not at the human-like stage yet, but the strike, and the subsequent questions about whether the “Man-in-the-Loop” principle had been broken, may serve as a forecast of what is to come.

By purportedly breaking the Man-in-the-Loop principle, Turkey has demonstrated how LAWS can operate with independent agency, without the ‘human factor’. Removing the human from the decision cycle greatly increases a system’s speed and efficiency, as it can almost instantaneously approve the kill chain that governs its programming. Yet with great efficiency comes great responsibility, and programming LAWS to reliably differentiate non-combatants from combatants is incredibly difficult. These systems can only interpret information that has been programmed into their databases, and elements overlooked at the programming stage limit their flexibility in combat situations. Undefined and complex combat zones may inhibit a system’s ability to compute the consequences of engagement, and in urban environments the system has a high chance of injuring non-combatants. As the war in Afghanistan showed, adversaries increasingly rely on guerrilla tactics, which means LAWS could not currently be deployed in complex urban environments. This greatly reduces the scope in which LAWS can be used and calls for the “old school” boots-on-the-ground approach.

Classical warfare thus presents a conundrum for LAWS, systems touted to save more lives than they take on the battlefield. Early hypotheses suggest that LAWS can ultimately be used in support roles, and propose that autonomous vehicles could conduct high-risk medical evacuations to increase the survivability of assets on the ground. Proponents also cite the systems’ lethal efficiency as a benefit of deployment: a single system could be equipped with the capabilities of roughly 100 soldiers, cutting costs for the military. Additionally, research suggests that a soldier’s cognitive abilities can break down under the overload of combat stress, disrupting their decision-making. Removing the ‘human factor’ from combat would streamline warfare and stem the flow of casualties. Finally, advocates and the literature surrounding LAWS theorize that these systems are simply the third revolution in warfare, and that their arrival is inevitable.

As LAWS and AI technology evolve side by side, the laws surrounding their use in combat are evolving at a frustratingly slow pace. Philip Alston, a former UN Special Rapporteur on extrajudicial, summary, or arbitrary executions, has raised the concern that the lethal capacity of LAWS has been left largely unexplored by human rights and humanitarian actors. The lag between the evolution of technology and humanitarian law can be fatal, as countries may exploit grey areas and loopholes when actively deploying LAWS in combat. Even if laws are established, how does one punish a robot? It is impossible to charge a system with a crime; a system lacks a moral compass and will not feel the effects of prolonged isolation in a cell. The next option would be to punish a human, but whom: the programmer, an officer, or a ‘fall guy’ conveniently introduced to limit the political damage? There is no real answer, as the system operates with its own agency and decision-making but lacks the capacity to bear the consequences.

Warfare, and the normative framework surrounding modern combat, are destined to change as technology continues to evolve. In 2000, the U.S. Senate Armed Services Committee was already looking ahead to the eventual introduction of LAWS and set two ambitious goals: within ten years, one-third of all deep-strike aircraft would be unmanned, and within fifteen years, one-third of ground combat vehicles would operate without human beings on board. For better or for worse, the world will feel the impact of LAWS, and as the Turkish incident suggests, their formal introduction is closer than we think.

James Fleming is a BAH student at Queen’s University majoring in Political Studies. He has a keen interest in international affairs and the intersection of military affairs and technology.


Centre for International and Defence Policy
Contact Report

The CIDP is part of the School of Policy Studies at Queen’s University and is one of Canada’s most active research centres on international security.