No, autonomous weapon systems are not unlawful under the Martens Clause

The Martens Clause, Art.1(2), Additional Protocol 1 to the Geneva Conventions

TL;DR: I disagree with the morality-based conception of ‘humanity’ that Human Rights Watch use in their most recent report, and do not believe that autonomous weapons go against the legal conception. There are no defined levels for how much public opposition is required to outlaw a weapon under the Martens Clause, so we cannot say that current public opposition is enough. Thus, autonomous weapons are not prohibited by the Martens Clause.

This morning I had several messages from friends telling me that there was a new report out on killer robots that I should read. The report is ‘Heed the Call’ by Human Rights Watch (HRW). It is promoted as showing how ‘Killer Robots Fail Key Moral, Legal Test’ and has been shared on Twitter as apparently showing that ‘Lethal autonomous weapons violate a century-old clause in international law’.

I happened to be doing some work on how the Martens Clause applies to autonomous weapon systems (AWS) a few weeks ago, and I came to the exact opposite conclusion. Let’s dig into what this is all about.

The Martens Clause, named after its creator F.F. Martens, was drafted at the 1899 Hague Peace Conference which led to the 1899 Hague Convention. It was created as a compromise measure between disagreeing states as to whether inhabitants of occupied territories could be regarded as lawful combatants if they took up arms against the occupying force (Greenwood pg.129, in Fleck 1995). The clause appeared in the preamble to the second 1899 Hague Convention and has been included, and slightly updated, in the 1907 Hague Conventions (IV, preamble), 1949 Geneva Conventions (Art.64, GC I; Art.62, GC II; Art.142, GC III; Art.158, GC IV), and the 1977 Additional Protocols to the Geneva Conventions (API, Art.1(2); APII, Preamble).

The version mostly referred to in modern international law is that from API:

‘In cases not covered by this Protocol or by other international agreements, civilians and combatants remain under the protection and authority of the principles of international law derived from established custom, from the principles of humanity and from the dictates of public conscience.’

The more minimalist version from APII is rarely mentioned:

‘…in cases not covered by the law in force, the human person remains under the protection of the principles of humanity and the dictates of the public conscience.’

Still, they both mention the two key tenets of the rule: in circumstances not covered by specific treaty law, international law will still determine the legality of the issue, and in such cases the applicable rules will come from customary law, the principles of humanity, and the dictates of public conscience.

First, let us consider customary law. For any non-international lawyers reading this, customary law is basically the set of rules which aren’t in treaties and which states make up as they go along. It is made up of two parts: state practice and opinio juris. For a rule of custom to form, ‘[n]ot only must the acts concerned amount to a settled practice, but they must also be such, or be carried out in such a way, as to be evidence of a belief that this practice is rendered obligatory by the existence of a rule of law requiring it.’ (North Sea Continental Shelf Case, 1969, para.77). As technology has only recently reached a level where some systems could be considered fully autonomous (e.g. the Israeli Harpy/Harop), there has been no chance for settled state practice or opinio juris to form. Thus, there cannot be any customary rules preventing the existence or use of autonomous weapons. Furthermore, many understandings of what an AWS actually is would not include the Harpy drone, and on those conceptions customary rules are even further away.

The IAI Harop UAV. Photograph taken at the Paris Air Show, 19 June 2013, by Julian Herzog.

This leaves us with the principles of humanity, and the dictates of public conscience. To most people, these sound like moral or ethical concepts, and indeed this is how they are portrayed in HRW’s Heed the Call (pgs.19–22, 28–43). The problem is that the Martens Clause comes from legal documents, is a legal concept, and in order to be applied it needs to be defined legally. Indeed, Martens himself attempted to legally define ‘humanity’ (Cassese, pgs.188–189), and the concept has previously been referred to as the ‘laws of humanity’ (Ticehurst, pg.129).

HRW suggest that ‘The first principle of humanity requires the humane treatment of others’(pg.19) and that in order to do this, ‘one must exercise compassion and make legal and ethical judgments.’(pg.20) The effect of this is that ‘[w]hile compassion provides a motivation to act humanely, legal and ethical judgment provides a means to do so.’(pg.20)

This all sounds well and good, and indeed morally right. But it is not a legal definition of a legal concept. In terms of legally defining ‘humanity’, we might first look to the Law of Armed Conflict (LoAC) principle of humanity, which suggests that one should not use any more force than necessary for ‘legitimate military purposes’ (UK Manual, para.2.4). Whilst this conception of humanity exists as a principle in LoAC, it is not necessarily the same thing as ‘the principles of humanity’ within the Martens Clause. Still, Ticehurst accepts this conception as applying to the Martens Clause (Ticehurst, pg.129). Would AWS breach this rule and use more force than necessary? I doubt it. Imagine, for example, that an autonomous uninhabited aircraft (drone) is deployed to destroy enemy tanks using anti-tank missiles. There is no reason to think that, just because it is autonomous, the drone would fire many missiles when one would do the job. So that is not a ground for AWS breaching the Martens Clause.

Meron and Cassese have similar conceptions of ‘humanity’ as it relates to the Martens Clause, which differ from Ticehurst’s. Both cite a number of cases and other influential instruments which link to the Martens Clause (Meron, pgs.82–83; Cassese, pgs.202–207). For brevity, I’ll put their conclusions together rather than discuss the cases individually. Their analyses suggest that ‘the principles of humanity’ as they relate to the Martens Clause comprise: the rules of warfare in Common Article 3; the distinction principle; the prohibition on attacking civilians; sparing civilians as much as possible; limitations on means and methods of warfare; abiding by the notion of chivalry; the prohibition on torture; and the prohibition on collective punishment.

In terms of Common Article 3, there are two aspects which are problematic for AWS. The first is the prohibition on ‘violence to life and person’ of those hors de combat (‘out of the fight’: those who surrender, are prisoners, or who cannot fight due to incapacitation or sickness). Those who surrender could be protected if a system is programmed to recognise ‘hands up’ gestures (e.g. the Samsung SGR-A1) or white flags. Those held in PoW camps or in military hospitals can be protected through no-strike lists. This leaves those who are incapacitated through injury or sickness on the battlefield. One day, AWS may be able to recognise such protected individuals (although this is unlikely). However, attackers need only take such individuals into account where it would be reasonable to do so. For example, those prosecuting an aerial or artillery attack would find it difficult, or impossible, to evaluate injuries sustained by the enemy from afar. Thus, if it would be reasonable to use an AWS which cannot evaluate injuries or sickness on the battlefield, then this would be lawful (see AMW Manual Rule 15(b) commentary, para.7). Therefore, it is possible that AWS attacks could be lawful under Common Article 3. Of course, ideally, a human operator would take control of an AWS where there is a risk of harming those hors de combat, even if this means the system does not perform the operation wholly autonomously.

The African Charter on Human and Peoples’ Rights

The second aspect of Common Article 3 that could be problematic is the notion of ‘outrages upon personal dignity’. According to HRW, human dignity is a second principle of humanity (pg.23). One conception that HRW mention (pgs.23–27) is that of Heyns, who has written on this subject as it relates to armed conflict and law enforcement from a human rights perspective. As Heyns himself notes, the notion of the ‘right to dignity’ is unclear (pg.49). Therefore, despite being provided for in the African Charter on Human and Peoples’ Rights, Art.5 (which HRW mistake as Art.7, see fn.76), the lack of clarity as to how this right might apply makes it difficult to say for certain that it applies to AWS. Even if it does, it only applies to AWS to the extent that the African Charter applies; as neither the Charter nor the African Court has expanded on extraterritorial application, we should assume that its applicability extends only to states parties within their national borders (for an overview of extraterritoriality as it applies to the African Charter, see here).

Regardless of extraterritoriality, due to the unclear application of the right to dignity, and of what it entails as a matter of law, the notion as it applies to AWS would seem to be a suggestion of lex ferenda (what the law should be) rather than lex lata (what the law is). Thus, as it is not fully formed, it cannot be applied as a legal concept, and AWS cannot breach it, nor the Martens Clause on this ground. Still, I would agree with the notion in a moral sense. To paraphrase Heyns’ words, AWS may be ‘lawful, but awful’.

We shall now return to the other aspects which Meron and Cassese found applied to ‘humanity’ in relation to the Martens Clause. In terms of the distinction principle and the prohibition against attacking civilians, an AWS programmed only to recognise and attack adverse military targets would abide by these rules. As for sparing civilians as much as possible and the limitations on means and methods, these duties would normally fall to the humans programming the mission parameters into an autonomous system, and so they are not a ground for AWS breaching the Clause.

In terms of chivalry, the concept has been defined confusingly in several works. However, Solis considers it carefully, and essentially concludes that it means to act in good faith (Solis, pgs.5–6). Unless AWS are programmed in bad faith, this is not a notion that could lead to AWS breaching the Martens Clause. Finally, torture and collective punishments are things that would need to be decided upon by humans, and do not relate to the autonomous nature of an AWS. They, therefore, would not be a ground for AWS breaching the humanity principle.

So, we can see that AWS do not breach the humanity principle according to Meron and Cassese’s understandings of it, although protecting those hors de combat can be problematic. We can now move on to the ‘dictates of public conscience’.

In terms of what this means, there is no agreed definition, but it is assumed to be synonymous with public opinion (Meron, pg.83). Indeed, HRW note that ‘“public” clarifies that these dictates reflect the concerns of a range of people and entities’ (pg.28) and I see no reason to disagree. HRW emphasise public opposition to AWS (pgs.30–31), and the groups which disagree with the production and usage of AWS, or find them morally reprehensible (pgs.32–43). However, as Greenwood notes, the notion of ‘public conscience’ is so vague that, not only has it garnered little support, it is impractical to use (pg.129, in Fleck 1995). Therefore, there is no threshold above which one could say there is enough public opinion opposing AWS that they should be deemed unlawful. Thus, despite the considerable opposition which HRW note, we cannot say that AWS go against the ‘dictates of public conscience’ because we simply cannot know what level of opposition is required, and how it should be measured.

In conclusion, despite being morally problematic, AWS do not breach the Martens Clause. First, there are no customary rules prohibiting the use of AWS. Second, it is by no means clear that AWS will breach the ‘principles of humanity’ as they are legally understood, although safeguarding those hors de combat is problematic in some situations. Finally, the ‘dictates of public conscience’ have no defined threshold for the level of public opposition required to make a means or method of warfare unlawful, so we cannot say that AWS go against these dictates. Therefore, there is nothing inherent about the existence or use of autonomous weapon systems which makes them unlawful under the Martens Clause.