Assessments of Liability for Violations of International Law involving Lethal Autonomous Weapons Systems: Abstract Entities and Algorithmic Accountability – Helen Stamp

In 1950, the International Military Tribunal sitting at Nuremberg put forward the following fundamental proposition, reported in the official proceedings of the Tribunal [466], regarding criminal responsibility for crimes of international concern:

“[c]rimes against international law are committed by men, not by abstract entities, and only by punishing individuals who commit such crimes can the provisions of international law be enforced.”  

This statement was in response to arguments submitted to the Tribunal that it is sovereign states which are governed by international law, rather than individuals, and that individuals are therefore protected from personal responsibility when the act in question is an ‘act of state.’ This submission was strongly rejected by the Tribunal, which confirmed that individuals can be held responsible under international law, and specifically under the laws of war; a position which has been maintained and has informed the development of international criminal law since that time.

Seventy-one years on, the notion of individual criminal responsibility is again being challenged: not through arguments of state sovereignty, but by the very technology incorporated into the weaponry now being developed and used in modern armed conflicts.

In particular, Lethal Autonomous Weapons Systems (LAWS) – where responsibility for decisions is shared between a human operator and, to varying degrees, an autonomous digital system – have created a challenge to established legal frameworks and accountability mechanisms that would have been unimaginable to those sitting at Nuremberg many years ago.

Applicable legal frameworks for LAWS

The international community has conducted circular discussions on the regulation of LAWS and the concept of ‘meaningful human control’ (MHC) over the use of these weapons. While these discussions have produced little consensus to date, there is agreement that International Humanitarian Law (IHL) and International Criminal Law (ICL) should remain the legal frameworks regulating the development and use of these weapons. Implementing new legal regimes to address these weapons would be highly problematic, both in terms of what those regimes would contain and, on a more practical level, in obtaining agreement between States to adopt them.

There is still international momentum to ban these weapons altogether; however, it is difficult to see how this momentum can overcome the desire of States to acquire these weapons for strategic advantage.

The increased ‘accountability scope’ of LAWS

The technology of LAWS is obviously different from that of traditional weapons on the battlefield, and the ‘accountability scope’ of these weapons also differs. When traditional weapons are produced by a manufacturer, sold to militaries, and deployed in armed conflict, they arguably remain static in their functions and capabilities. This places greater responsibility on the individual on the battlefield for how they choose to use a weapon in accordance (or not) with applicable legal frameworks.

A defining feature of autonomous weaponry is that it can be developed to make certain decisions independently of human intervention and to ‘learn’ from the environment in which it later operates. As such, the way a LAWS operates when deployed on a battlefield may be influenced by decisions made by the company responsible for its development, such as particular programming choices or the data sets used. The dynamic nature of such weapons means that others, beyond the individual deploying the LAWS in armed conflict, may be accountable for violations of IHL which occur. The manufacture of weapons, including LAWS, is increasingly concentrated in the corporate domain, and, more recently, big tech companies are becoming more involved. Abstract entities, in the form of corporations developing the technology for LAWS, are now a reality, and their potential involvement in the commission of international crimes involving a LAWS, together with that of any individual combatants, should be considered.
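To make this dynamic quality concrete, the following is a minimal, hypothetical sketch in Python. Every name and number in it is invented for illustration and describes no real system; it simply shows how a confidence threshold set by a developer before deployment can shift in response to feedback gathered in the field, so that the system no longer behaves as it did when it was shipped.

    # Hypothetical sketch only: all names and numbers here are invented
    # for illustration and do not describe any real weapons system.

    class TargetClassifier:
        """Toy engagement model whose behaviour drifts after deployment."""

        def __init__(self, threshold: int):
            # A programming decision made before deployment; it shapes
            # every engagement decision the system later makes.
            self.threshold = threshold  # minimum confidence (percent) to engage

        def is_target(self, confidence: int) -> bool:
            # The engagement decision itself.
            return confidence >= self.threshold

        def learn_from_environment(self, confidence: int, was_target: bool) -> None:
            # Crude online update: feedback gathered in the field nudges
            # the threshold, so the deployed system no longer behaves as
            # it did when it left the developer's hands.
            if was_target and confidence < self.threshold:
                self.threshold -= 2   # missed a real target: more permissive
            elif not was_target and confidence >= self.threshold:
                self.threshold += 2   # false positive: more cautious

    classifier = TargetClassifier(threshold=90)  # the developer's choice
    print(classifier.is_target(85))              # False at deployment

    for _ in range(5):                           # feedback accumulates in the field
        classifier.learn_from_environment(confidence=85, was_target=True)

    print(classifier.is_target(85))              # True: behaviour has drifted

On this toy account, a later engagement decision cannot be explained by the developer's original settings alone, nor by the in-field adaptation alone; both form part of the causal story, which is why accountability may plausibly extend beyond the individual who deploys the system.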

The (truer) accountability gap

As the scope of potential liability for the development and use of LAWS widens, the gaps in the international legal framework to address this become more prominent. The practicalities, processes, and procedures for enabling established legal forums, such as the International Criminal Court (ICC), to competently hear such matters need to be worked through. Agreeing that IHL and ICL will continue to apply is one thing; enabling legal forums to determine violations of these frameworks is another. The truer accountability gap is arguably the ability of established legal forums to competently hear and determine such matters.

Useful and useable evidence

The workings of autonomous digital systems are not easily understood and have been criticised for their opaque nature. It would be unrealistic to expect that all such technology can be explained in legal terms or mapped neatly onto existing legal standards. Emerging academic commentary suggests that a ‘legally operative explanation’ for autonomous technology is sufficient to bridge this gap.

Producing a legally operative explanation requires engaging with the technology itself. The emerging field of algorithmic accountability seeks to obtain information from an autonomous digital system to ascertain whether potential liability can be established, and how this information can be fed into our legal systems as evidence. Technical specifications need to be supplemented with a record of the actions taken, the reasoning applied, the decisions made, and the rationale for them, at each stage of the commissioning and development lifecycle of an algorithmic system, by those within the company responsible for it.
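As a sketch of what such a lifecycle record might look like in practice, the following Python fragment captures decisions, actors, and rationales as an append-only audit trail. The structure and field names are invented for illustration and are not drawn from any existing standard or system.

    # Hypothetical sketch: the structure and field names are invented for
    # illustration and are not drawn from any existing standard or system.

    from dataclasses import dataclass, field
    from datetime import datetime, timezone

    @dataclass(frozen=True)
    class LifecycleDecision:
        """One recorded decision in the commissioning/development lifecycle."""
        stage: str       # e.g. "commissioning", "data selection", "testing"
        actor: str       # who within the company made the decision
        decision: str    # what was decided
        rationale: str   # why: the element a court is likely to need most
        timestamp: str = field(
            default_factory=lambda: datetime.now(timezone.utc).isoformat()
        )

    # An audit trail is simply an append-only list of such records.
    audit_trail: list[LifecycleDecision] = []

    audit_trail.append(LifecycleDecision(
        stage="data selection",
        actor="machine learning team lead",
        decision="trained the target classifier on desert-terrain imagery only",
        rationale="matched the expected theatre of deployment",
    ))

    # Later, an investigator or court can ask: what was decided,
    # by whom, at which stage, and why?
    for record in audit_trail:
        print(f"[{record.stage}] {record.actor}: {record.decision} "
              f"({record.rationale})")

A record of this kind is deliberately simple: for a court, the stage, the actor, and the rationale are likely to matter at least as much as the technical content of the decision itself.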

This information will only be useful to an international court if it can actually be used by that court, and academic discourse needs to turn to consideration of how this usability might be achieved.

This is necessary for determining the criminal responsibility of individuals accused of violations of international law through the use of a LAWS in armed conflict; however, a complete assessment of accountability for such a violation should arguably also consider the entity that developed the technology. Corporate culpability is being actively considered in domestic civil law jurisdictions, yet work on its applicability to violations of IHL lags behind. The Rome Statute of the ICC has no corporate accountability provisions; there is a compelling need for this omission to be properly examined. In modern armed conflict, the potential for corporate entities to become increasingly involved in crimes against international law is no longer an abstract proposition but a reality that the law must address.

Helen Stamp is a PhD candidate in the Minderoo Tech & Policy Lab at the University of Western Australia, where she researches concepts of control, responsibility, and accountability relating to the development and use of autonomous vehicles and autonomous weapons. Helen has practised law for over 11 years in civil litigation, community legal work, and as an Operations Lawyer for the Corruption and Crime Commission in Western Australia. Helen has also assisted in the prosecution of war crimes in the Special War Crimes Chamber in Sarajevo and, most recently, worked for eight years as an Adviser in International Humanitarian Law for Australian Red Cross.