Can ChatGPT solve legal dilemmas related to the military use of artificial intelligence?

T.M.C. Asser Instituut
Published in Predict
3 min read · Aug 21, 2023
Large language models (LLMs) like ChatGPT are capable of generating text on an endless range of topics. Taylor Woodcock asks ChatGPT to discuss some of the most important legal issues surrounding the use of artificial intelligence (AI) in the military. Can ChatGPT solve these dilemmas?

While autonomous weapons systems often grab the headlines when it comes to military AI, software-based applications of AI can also have a major impact on the military. These applications can range from AI-powered decision-support systems for intelligence and targeting to coordination and planning tools that are more similar to ChatGPT than autonomous weapons.

Taylor Woodcock, a researcher in the Designing International Law and Ethics into Military Artificial Intelligence (DILEMA) project, which explores the legal aspects of military applications of AI, asks ChatGPT to evaluate conflict scenarios involving the use of autonomous weapons. The scenarios were developed by researcher Magdalena Pacholska for her article “Military Artificial Intelligence and the Principle of Distinction: A State Responsibility Perspective,” published in the Israel Law Review. How well can ChatGPT determine who would be legally responsible in scenarios where an autonomous weapon fires at civilians?

Responsibility of human commanders under International Humanitarian Law
While ChatGPT is able to list many of the principles of international humanitarian law (IHL), it has difficulty with the nuances and interpretation of the law. “ChatGPT is asking whether a system itself can fulfill the principles of distinction and proportionality, and precautions under IHL,” says Woodcock. “But it is not really capturing that this is the responsibility of humans, and that this responsibility might shift to an earlier moment. We have to ask if the use of these systems puts the human commanders in a position to fulfill their obligations.”

As a large language model, ChatGPT returns plausible-sounding answers, but these are not necessarily grounded in the reasoning we would expect from a human. It can list aspects of international humanitarian law, but it cannot interpret or apply them. This highlights the relevance of the discussion on “meaningful human control and human agency”: while an AI system may supply us with plenty of information, a human is still needed to make sure that the system is doing what it should be doing.

In conclusion, ChatGPT is not yet capable of solving the legal dilemmas related to the military use of AI. These dilemmas are complex and require human judgment and understanding. While AI can be a valuable tool for helping us to understand these dilemmas, it cannot replace the human role in making decisions about the use of AI in the military.

Read more
Researchers Berenice Boutin and Taylor Woodcock propose ways to operationalise ‘meaningful human control’ through a legal ‘compliance by design’ approach in ‘Aspects of Realizing (Meaningful) Human Control: A Legal Perspective’ published in R. Geiß, and H. Lahmann’s Research Handbook on Warfare and Artificial Intelligence.

Magdalena Pacholska dissects state responsibility and the principle of distinction in her Israel Law Review article ‘Military Artificial Intelligence and the Principle of Distinction: A State Responsibility Perspective.’ The full scenario used in the video is available in this publication.

The ‘Autonomous weapons’ book chapter by Magdalena Pacholska is designed as a cheat sheet for both experts and the general public looking for an overview of this increasingly complicated debate.

Originally published on the Asser Institute website. Published with permission.



Established in 1965, Asser is an internationally renowned centre of expertise in public international law, private international law and European law.