Research Article

The commission of crimes using autonomous weapon systems: Issues of causation and attribution

Vasilios Tzoufis*, 1, Nikolaos Petropoulos2

* Corresponding author: btzoufis@gmail.com

1 Faculty of Law, Aristotle University of Thessaloniki, Thessaloniki, GR

2 City University of New York, New York, US

Abstract  Today’s militaries take advantage of technological progress primarily through developments in the war industry. Of particular importance is the development and use of advanced autonomous weapon systems, which are changing the way conflicts are fought and redefining basic strategies on the modern battlefield. The use of these systems, however, raises ethical, philosophical, and legal questions about the relationship of causation and attribution that must exist between an act and its criminal result.

Die Begehung von Straftaten durch autonome Waffensysteme: Fragen der Kausalität und Zurechnung

Zusammenfassung  Das moderne Militär macht sich den technologischen Fortschritt in erster Linie über Entwicklungen in der Rüstungsindustrie zunutze. Von besonderer Bedeutung sind dabei die Entwicklung und der Einsatz fortschrittlicher autonomer Waffensysteme, die die Art und Weise der Konfliktführung verändern und die grundlegenden Strategien auf modernen Schlachtfeldern neu definieren. Der Einsatz dieser Systeme wirft jedoch ethische, philosophische und rechtliche Fragen zum Verhältnis von Kausalität und Zurechnung auf, das zwischen einer Handlung und den strafrechtlichen Folgen bestehen muss.

Keywords  autonomous weapons systems, attribution, causality, criminal law, international humanitarian law

This article is part of the Special topic “Malevolent creativity and civil security: The ambivalence of emergent technologies,” edited by A. Gazos, O. Madeira, G. Plattner, T. Röller, and C. Büscher. https://doi.org/10.14512/tatup.33.2.08

© 2024 by the authors; licensee oekom. This Open Access article is licensed under a Creative Commons Attribution 4.0 International License (CC BY).

TATuP 33/2 (2024), S. 35–41, https://doi.org/10.14512/tatup.33.2.35

Received: 20. 12. 2023; revised version accepted: 24. 04. 2024; published online: 28. 06. 2024 (peer review)

Introduction

One of the basic features of a free economy, as social experience readily shows, is the production and distribution of as many goods (consumer products) as possible. Rapid technological development maximizes the potential for increasing productivity through automation. This process, together with the need to reduce, if not eliminate, risks, has led to research on and application of robotics and artificial intelligence systems, whose use in the production process and in everyday life raises ethical, philosophical, and legal dilemmas. The most modern applications of technological development were bound to reach the war industry, a branch of industry of vital importance not only for the economy and the balance of trade of many states but also for their defense and security. Modern armies make broad use of the technology developed by the war industry: of particular importance is the production and use of advanced autonomous weapon systems (AWS), which are changing the way conflicts are conducted and redefining the basic concepts of strategy on the modern battlefield. With the use of these systems, the human factor takes a back seat, as they are characterized by a greater or lesser degree of autonomy, i.e., the ability to respond to uncertain situations by independently composing and selecting among different courses of action to accomplish goals based on knowledge and a contextual understanding of the world, itself, and the situation (STO 2020, p. 16).

This reality poses significant challenges for the prosecution of violations of international humanitarian law (IHL), given that establishing causality and imputation under traditional criminal theory becomes more complex than for corresponding violations committed with conventional weapon systems. For instance, what happens when a system that guides a weapon, or that decides on the choice of target or the circumstances of engaging it, violates IHL: by failing to distinguish between civilians and combatants, by striking the target without taking precautions to protect civilians, or by violating the principle of proportionality because the expected harm to civilians or civilian property is excessive in relation to the expected military advantage? Who will be punished as an IHL offender in this case? And what if an autonomous weapon system has been specified by its manufacturer from the outset to hit targets in violation of IHL? Can the military user or the manufacturer be punished as the perpetrator? (Anderson and Waxman 2013, pp. 16–17)

Autonomous weapons systems and the risks of violating international humanitarian law

Given that the use of AWS has become widespread in recent years, the need for a specific institutional framework for their use is now commonplace: they have been systematically used in warfare in Iraq and Afghanistan (Wall and Monahan 2011, p. 240), as well as in the fight against international terrorism (Ambos and Alkatout 2012, p. 342). These weapons were used for targeted operations and to support conventional battle tactics. However, the war in Ukraine has given a concrete example of the battlefield of the 21st century: widespread use of AWS, attacks against military (and, unfortunately, civilian) targets carried out exclusively by drones, precision weapons with built-in cameras transmitting images to the military user in real time, more effective target reconnaissance, precision strikes deep into enemy territory, etc. (Thompson 2024). Indeed, it can be argued that “more than any conflict in human history, the fighting in Ukraine is a war of drones” (Mozur and Hopkins 2023).

The use of these systems has raised several scientific and ethical/philosophical issues that already occupy the literature. The matter concerns states, international organizations, and military alliances, and various attempts have been made to define the essential characteristics of these systems. Despite the diversity of definitions, which have been given either within the framework of the NATO Alliance or by individual countries (STO 2020, p. 16; Department of Defense 2023, pp. 15–18; Defence Ethics Committee 2021, pp. 3–5; People’s Republic of China 2022; ICRC 2014, pp. 61–64; 2021, pp. 5–6), one can identify the essential characteristics that differentiate these systems from other standard weapon systems. The autonomy of these systems lies in their ability to select and apply force to targets without human intervention. This means that the user does not choose, or even know, the specific target(s) or the precise timing and/or location of the action. This characteristic presupposes that an autonomous system can understand the battlefield environment and decide on a course of action, i.e., it can self-initiate or trigger a strike based on a generalized ‘target profile’.
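
To make the notion of a generalized ‘target profile’ concrete, the following minimal sketch, written for exposition only, models a system that matches sensed objects against a profile and self-initiates engagement; every field, threshold, and sensor input here is hypothetical and describes no real system.

```python
# Hypothetical illustration of 'target profile' matching; all fields,
# thresholds, and sensor inputs are invented for exposition only.
TARGET_PROFILE = {
    "radar_signature": "tracked_vehicle",  # a generalized class, not a specific target
    "min_confidence": 0.9,                 # minimum classifier confidence
}

def self_initiated_strike(detections):
    """The user never picks the specific target, time, or place: the
    system triggers on whatever matches the generalized profile."""
    for obj in detections:  # objects reported by onboard sensors
        if (obj["radar_signature"] == TARGET_PROFILE["radar_signature"]
                and obj["confidence"] >= TARGET_PROFILE["min_confidence"]):
            return obj  # engagement initiated by the system itself
    return None
```

On this model, the legally decisive fact is that the operator’s choice ends at the profile: which object is struck, and when and where, is determined by the system’s matching loop.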

Therefore, the ethical and legal issues relate to the extent to which human intervention in the use of these systems remains possible. For this reason, the distinctions made by the Ethics Committee of the French Ministry of Defence are fundamental. The committee makes two important distinctions: a) between automated weapon systems, in which automation performs non-critical ‘low-level’ functions, and lethal weapon systems in which critical ‘high-level’ decision-making functions are automated; the decision-making and command functions allocated to automation represent the factors of disruption; b) among the higher-level weapon systems, between lethal autonomous weapon systems (LAWS) and partially autonomous lethal weapon systems (PALWS), which remain under human control. According to this distinction, LAWS are lethal weapon systems programmed to be capable of changing their own rules of operation and therefore likely to depart from the employment framework initially defined. Their software may compute decisions to perform actions without any assessment of the situation by the command. On the other hand, “PALWS are lethal weapon systems integrating automation and software: [a)] to which, after assessing the situation and under their responsibility, the military command can assign the computation and execution of tasks related to critical functions such as identification, classification, interception and engagement within time and space limits and under conditions, [b)] which include technical safeguards or intrinsic characteristics to prevent failures, misuse and relinquishment by the command of two essential duties, namely situation assessment and reporting. […] In other words, a PALWS is a lethal weapon system whose decision-making functions are defined according to the robotics meaning of decision-making autonomy, i.e., within a specific framework of action” (Defence Ethics Committee 2021, p. 4). At the same time: “A PALWS cannot take lethal initiatives that would result in it altering its functional scope” (Defence Ethics Committee 2021, p. 4).
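
The committee’s two defining constraints, engagement only within command-set time and space limits and no power to alter the system’s own functional scope, can be made tangible with a small, purely hypothetical sketch; the class names, fields, and units are all invented.

```python
from dataclasses import dataclass

@dataclass(frozen=True)  # frozen: the system itself cannot rewrite the envelope
class Envelope:
    """Command-defined framework of action for a PALWS (hypothetical model)."""
    zone: tuple                 # (x_min, x_max, y_min, y_max) bounding box
    time_window: tuple          # (t_start, t_end) in mission time
    allowed_classes: frozenset  # e.g. frozenset({"armored_vehicle"})

def palws_may_engage(envelope: Envelope, target_class: str,
                     position: tuple, t: float) -> bool:
    """A PALWS computes engagement tasks only inside the envelope that the
    military command assigned after its own situation assessment."""
    x, y = position
    x_min, x_max, y_min, y_max = envelope.zone
    in_zone = x_min <= x <= x_max and y_min <= y <= y_max
    in_time = envelope.time_window[0] <= t <= envelope.time_window[1]
    return in_zone and in_time and target_class in envelope.allowed_classes

# A LAWS, by contrast, would be a system able to replace its own envelope
# or rules of operation at runtime, e.g. something like
#   self.envelope = self.learn_new_envelope(...)
# which is precisely the step the committee's definition denies to a PALWS.
```

The sketch is not a safety design; it only localizes where, on the committee’s definitions, human control resides: in who writes, and who may rewrite, the framework of action.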

Can the military user or the manufacturer be punished as the perpetrator?

In any armed conflict, the fundamental humanitarian problem is the risk of violation of the basic principles of IHL. These include the principle of distinction between civilians and combatants, enshrined in Articles 48, 51, and 52 of Additional Protocol I (UN 1977) to the Geneva Conventions, according to which attacks against civilians and civilian objects are prohibited in the conduct of combat operations, and the principle of proportionality, enshrined in Articles 51 (5) (b), 57 (2) (a) (iii), and 57 (2) (b) of the same Protocol, which prohibits attacks which may be expected to cause incidental loss of civilian life, injury to civilians, damage to civilian objects, or a combination thereof, which would be excessive in relation to the concrete and direct military advantage anticipated.
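
Purely as a toy illustration, and emphatically not a claim that these rules can be mechanized (the proportionality test is an evaluative legal judgment, not an arithmetic one), the two principles can be read as predicates that any engagement decision must satisfy; the function name, the scalar scales, and the threshold below are all hypothetical.

```python
def strike_permitted(target_is_military: bool,
                     expected_civilian_harm: float,
                     anticipated_military_advantage: float,
                     excess_threshold: float = 1.0) -> bool:
    """Toy reading of AP I. Distinction (Arts. 48, 51, 52): no attack on a
    non-military objective. Proportionality (Arts. 51(5)(b), 57(2)(a)(iii)):
    no attack whose expected incidental civilian harm is excessive relative
    to the concrete and direct military advantage anticipated. The scalars
    and the threshold are invented; in law this balancing is qualitative."""
    if not target_is_military:  # principle of distinction
        return False
    if anticipated_military_advantage <= 0:  # no advantage, nothing to weigh
        return False
    ratio = expected_civilian_harm / anticipated_military_advantage
    return ratio <= excess_threshold  # principle of proportionality
```

The point of the sketch is the article’s question in miniature: if such a predicate is evaluated by the weapon system itself rather than by a human commander, who answers for a wrong result?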

Therefore, in line with the above, attributing liability for the violation of these principles of IHL does not encounter particular problems in cases involving partially autonomous lethal weapon systems: because of the more or less active role of the human factor, issues of causation and attribution arising from IHL violations committed with PALWS can be treated in the same way as violations committed with conventional weapon systems. However, as we will see below, causation and attribution of an act to a perpetrator require a radical change in the interpretive tools of traditional criminal theory when the violation is caused by a lethal autonomous weapon system (LAWS), whose autonomy precludes initial or subsequent user intervention. How can responsibility be attributed to a soldier who launches an autonomous weapon system that the manufacturer has already programmed not to distinguish between civilians and combatants (impossibility of human intervention even at the initial stage)? And how can responsibility be attributed to a soldier who fires a weapon system in order to hit a military target, then realizes that civilians are going to be hit, something he could not have foreseen, while the autonomous weapon system does not allow him to intervene afterward, e.g., by modifying its course (impossibility of human intervention at the stage after launch)?

Despite the optimistic view expressed about their use, according to which a modern ‘smart’ weapon system can protect civilians or limit their losses, so that army robots may prove more humane on the battlefield than humans (Arkin 2010, pp. 334–335), the sad reality of today’s wars confirms the opposite: in all wars, civilian casualties are significant. It is therefore perfectly justified to be concerned about the continuous production of weapons capable of killing or injuring civilians en masse and to entertain a dystopian vision of the near future, with armed robots running amok against civilians (Guha and Galliott 2023, p. 64).

On the other hand, the use of autonomous weapon systems has made it possible to conduct military operations from afar, without the local presence of the military personnel involved, making it challenging to investigate responsibility for violations of IHL during the execution of a combat operation. States and their secret services are indeed exploiting the potential of this technology: it is now a fact that several of their military operations use modern AWS technology to carry out “targeted killings” (Wall and Monahan 2011, p. 242) or “preventive wars” (Fabiano 2023, pp. 214–215), a practice that may affect people who do not have combatant status, or who are mere suspects (Dombrowsky 2013, p. 236), and that is therefore criticized as contrary to IHL (Ambos and Alkatout 2012, p. 359). Notably, the Israeli High Court of Justice considers this practice justified only under stringent conditions (Rabi and Plaw 2020, pp. 227–231).

The problem of the use of autonomous weapons systems is compounded by the fact that they are now available to non-state entities.

It should also be pointed out that the problem of the use of autonomous weapon systems is compounded by the fact that they are now available to non-state entities, whose prosecution is highly doubtful in the absence of competent judicial institutions. According to the United Nations, a non-state actor (NSA) is defined as a group of people that has “the potential to employ arms in the use of force to achieve political, ideological or economic objectives; [but] are not within the military structures of states, state-alliances or intergovernmental organizations; and are not under the control of the state(s) in which they operate” (Rutherford 2015, pp. 76–77).

Although NSAs include different categories of armed groups, terrorist groups are perceived as the most notorious and dangerous to both states and civilians, given their widespread use of weapons in the international environment. What is particularly problematic about NSA activities is the apparent lack of enforcement mechanisms in the current global judicial system for applying international humanitarian law to NSAs (Meron 2011, pp. 32–35). In other words, although in the context of international law the International Criminal Court is perceived as the only competent judicial authority that could prosecute perpetrators of war crimes, including NSAs, the existing mechanisms and procedures do not appear effective in enforcing the court’s decisions in this context, i.e., in the case of NSAs (Olásolo 2005, pp. 101–103). Of course, states do enjoy the right to bring perpetrators to justice in their own territory when NSAs violate the respective provisions of national law (Porter 2010, p. 198).

Over the last decade or so, several non-state actors across the globe have incorporated drones into their operations, with at least five of them operating in the Middle East – namely, Hezbollah, Hamas, the Houthi Movement, Islamic State, and the Kurdistan Workers’ Party (Veilleux-Lepage and Archambault 2022, pp. 45–52). Armed unmanned aerial vehicles have also been used in the conflict zones of Ukraine, Myanmar, Mexico, and Ecuador, as well as in Central Asia (Haugstvedt 2023, p. 32).

While most rebel groups, from South America to the Middle East, have used commercially available, low-cost drones extensively but purely for surveillance and reconnaissance purposes, the Islamic State of Iraq and Syria became the first terrorist group to use drones to carry out attacks, demonstrating a more tech-savvy approach to modern warfare (Almohammad and Speckhard 2017, pp. 45–50). More recently, Houthi rebels in Yemen have used drones to attack Saudi oil facilities, demonstrating a high degree of precision in their strikes (Haugstvedt 2023, pp. 33–35). In early December 2023, the rebels threatened to disrupt the global oil supply by deploying drone swarms in certain parts of the Red Sea, targeting oil tankers heading toward the Suez Canal.

What makes the use of drones by NSAs stand out is the fact that, unlike other forms of military power, air power has traditionally been associated with statehood and sovereignty; the use of drones by non-state actors, therefore, represents an incursion by non-state groups into the prerogative of sovereign states – not only militarily, but also conceptually and symbolically (Archambault and Veilleux-Lepage 2020, p. 78).

A worrisome related development is that while high-end technology was once the exclusive privilege of states with robust military budgets, advanced technology has over time become accessible to individuals and non-state actors, who can today acquire affordable commercial off-the-shelf systems and integrate them into attack planning.

Autonomous weapons systems and criminal liability (attribution and causation)

An indispensable element of the concept of crime in criminal theory, particularly in continental law (Germany, France, Greece, etc.), is the principle of attribution (nullum crimen, nulla poena sine culpa), a partial principle of the principle of the legality of crimes and penalties (nullum crimen, nulla poena sine lege), which also highlights the anthropocentric, personal character of criminal law as the law of the personal, individual responsibility of the specific offender. According to Kantian philosophy, respect for the greatness and value of the human being is the basic content of ethical theory, and man, as an intelligible character, is self-determined by deciding freely. The principle of attribution is enshrined in the constitutions of most countries. In the institutional framework of the Council of Europe, it is derived from Article 7 of the European Convention on Human Rights; in that of the EU, from Article 48 (presumption of innocence) of the Charter of Fundamental Rights of the European Union in conjunction with Article 1 (human dignity).

However, in cases of IHL violations through the use of AWS, it becomes challenging to attribute criminal responsibility to the military operator of an autonomous weapon system since, as mentioned above, a vital element of these systems’ operation is their ‘decision-making’ autonomy, i.e., their ability to select and strike targets independently of human intervention. It is therefore inconceivable to attribute criminal liability to the military operator who activates an autonomous system but subsequently has no possibility of intervening when the system violates the principles of IHL due to a system ‘error’: even if the operator detects the impending violation in time, there is nothing he can do to prevent it. On the other hand, in such cases, any criminal liability of the AWS manufacturer must be examined, which may take the following forms: a) if he acted intentionally, i.e., programmed the weapon system precisely to violate IHL, e.g., to target civilians, his liability is apparent; b) if he acted negligently, it should be judged whether the technical/manufacturing protocols used to build the AWS were safe and appropriate to prevent, as far as possible, any violations of IHL and, if they were, whether the manufacturer actually complied with those specifications.

Given the above, it becomes clear that criminal law does not merely need a simple adaptation of its structures to constantly changing social conditions; for the first time in about two centuries, it faces the question of revising its fundamental principles so that it can fulfill its mission, in this case the protection of civilians from violations of IHL. In this context, a scientific debate has opened on whether to recognize a quasi-legal personality in robots (Chesterman 2020, pp. 820–822), and the scientific community appears divided: some are completely opposed, considering the capacity for imputation an element of the person, who develops a moral and emotional relationship with himself and his actions in the present and past (Gless 2017, p. 326), something that robots can never acquire; others are open to a new approach to criminal theory and a revision of basic concepts of criminal law such as culpability (Simmler and Markwalder 2017, pp. 37–47; Dremliuga and Prisekina 2020, p. 260), albeit under conditions (Gless and Weigend 2015, pp. 576–577).

In addition, the use of AWS raises issues of causality, which must exist between an act and a criminal result. In the context of IHL violations and harm inflicted on civilians through the criminal possibilities afforded by modern AWS technology, significant problems may arise in establishing the objective causal link between an act and a harmful effect. The use of autonomous weapon systems, especially the most advanced ones, often requires the cooperation of many people. There are drones controlled by an operator but assisted by a separate monitoring team; these drones are steered from the battlefield or from a station, which can itself be another drone. There are also drones with greater autonomy capability: such a drone is programmed in advance, and the operator can intervene, to a greater or lesser extent, to correct errors. Technological development is leading to the construction of fully autonomous systems (Balazünbül 2021, pp. 17–22). When IHL is breached in such cases, known as cumulative damage cases (‘Summationsschäden’) because the conduct of several parties causes the damage, it becomes difficult to ascertain the share of harm caused by each party, as in the instances of AWS use described above, where the attack on a target requires the action and/or cooperation of several persons, from the manufacturer and the programmer to the supervisor and the end user.
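
A minimal sketch of why apportionment is hard, with entirely hypothetical actors and conduct: each link in the chain is a condition of the final harm, but no single contribution maps cleanly onto a share of the result.

```python
from dataclasses import dataclass

@dataclass
class Contribution:
    """One actor's conduct in the causal chain leading to the harm
    (hypothetical model of a 'Summationsschaden' scenario)."""
    actor: str       # e.g. "manufacturer", "programmer", "operator"
    conduct: str     # what the actor did or omitted
    necessary: bool  # conditio sine qua non: would the harm have
                     # occurred without this conduct?

chain = [
    Contribution("manufacturer", "built the system to a flawed protocol", True),
    Contribution("programmer", "trained the target classifier on biased data", True),
    Contribution("supervisor", "approved deployment without review", True),
    Contribution("operator", "launched the system; could not intervene later", True),
]

# Under the conditio-sine-qua-non test, every link is a cause of the whole
# result, yet criminal law must attribute the act to a specific perpetrator:
# the chain yields causes, not shares of responsibility.
necessary_causes = [c.actor for c in chain if c.necessary]
print(necessary_causes)  # all four actors are equally 'causes' of the harm
```

The sketch is only a restatement of the legal problem in data-structure form: the causal test is satisfied by everyone in the chain, which is exactly why attribution, not causation, is the hard question.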

It is necessary for states to adopt standard uniform rules (protocols) for the operation of factories and companies involved in the manufacture of autonomous weapon systems.

As can be deduced from the above, the rapid development of new technologies in the war industry, aided by artificial intelligence applications, will sooner or later lead to the use of systems with a full degree of autonomy. Modern economic organization has made large corporations the main pillars of production; by developing autonomous weapon-system applications, the war industry therefore automatically becomes morally and socially responsible for the risks arising from the use of these technologies. It thus becomes clear that proving criminal responsibility for violations of IHL through the use of AWS is above all a difficult task, because one must examine in each case the greater or lesser possibility of user/military intervention during the use of an autonomous weapon system. Moreover, in a few years, examining human intervention will be irrelevant, since the use of fully autonomous systems is certain. It would therefore be more effective to define specific protocols for the manufacture of these autonomous weapon systems, compatible with IHL and legally binding, whose violation (e.g., the intentional or negligent programming of a system that attacks combatants and civilians indiscriminately) would automatically establish a violation of IHL. Similar legally binding protocols of action should also be adopted in the framework of international conventions governing the use of autonomous weapon systems, in which all military personnel should be trained under the responsibility of the states that make the relevant international commitments.

Conclusion

What has been set out above demonstrates the need to ensure respect for international humanitarian law in the face of the severe risks that modern weapon systems pose to civilians. For such protection to be adequate, initiatives must first be adopted at the international level. Of utmost importance is negotiation between states to adopt international conventions that recognize a commonly accepted definition of AWS and impose strict restrictions on the use of AWS in military operations, in the sense that their use will be delimited by the principles of IHL, notably proportionality and the distinction between civilians and combatants.

Prima facie, the most effective way to prevent violations of IHL through the use of autonomous weapon systems in warfare would be to ban them. Yet apart from the fact that adopting such initiatives at the international level would be extremely difficult for primarily political reasons connected with global power competition between states (Anderson and Waxman 2013, p. 21), such a prohibition would also be impractical: a ban on autonomous weapon systems with specific technical characteristics would not legally cover the rapid further development of those systems, while a complete ban on any development of autonomous weapon systems would exclude any possibility of manufacturing future weapon systems with ethical and legal specifications in accordance with IHL (Anderson and Waxman 2013, pp. 20–22).

However, as mentioned above, it is a given that the development of modern weapon systems will proceed rapidly in the near future due to the applications of artificial intelligence. For this reason, states must also adopt standard uniform rules (protocols) for the operation of factories and companies involved in the manufacture of AWS, to ensure that every business, at every stage of production, complies with all the requirements and prohibitions of IHL (and ethics). This resembles companies’ compliance programs (Reichert 2011, p. 114). It would also be helpful for all AWS to have a built-in camera so that their use can be monitored for compliance with IHL and the principle of transparency (Lieblich 2012, p. 470). After all, the use of autonomous drones is already being evaluated as an effective way to detect IHL violations: the U.S. Agency for International Development announced the delivery of nine autonomous drones to the Office of the Prosecutor General of Ukraine to collect material on possible violations of IHL by the Armed Forces of the Russian Federation (USAID 2023). In addition, interstate agreements or agreements between international organizations could adopt minimum ethical values as standards for the construction of AWS (Trusilo 2023, pp. 12–13), such as the development of algorithms with superior target-discrimination capabilities (combatant vs. noncombatant status) (Arkin 2010, pp. 340–341), as well as the monitoring of states’ compliance with the rules on the use of AWS through international organizations or committees (such as the International Humanitarian Fact-Finding Commission; Steinberg and Herzberg 2018, pp. 298–299).

Funding  This work received no external funding.

Competing interests  The authors declare no competing interests.

References

Almohammad, Asaad; Speckhard, Anne (2017): ISIS drones. Evolution, leadership, bases, operations and logistics. Washington, DC: International Center for the Study of Violent Extremism. Available online at https://www.icsve.org/isis-drones-evolution-leadership-bases-operations-and-logistics, last accessed on 29. 04. 2024.

Ambos, Kai; Alkatout, Josef (2012): Has ‘justice been done’? The legality of bin Laden’s killing under international law. In: Israel law review 45 (2), pp. 341–366. https://doi.org/10.1017/S002122371200009X

Anderson, Kenneth; Waxman, Matthew (2013): Law and ethics for autonomous weapon systems. Why a ban won’t work and how the laws of war can. Stanford, CA: Stanford University. https://doi.org/10.2139/ssrn.2250126

Archambault, Emil; Veilleux-Lepage, Yannick (2020): Drone imagery in Islamic State propaganda. Flying like a state. In: International Affairs 96 (4), pp. 955–973. https://doi.org/10.1093/ia/iiaa014

Arkin, Ronald (2010): The case for ethical autonomy in unmanned systems. In: Journal of Military Ethics 9 (4), pp. 332–341. https://doi.org/10.1080/15027570.2010.536402

Balazünbül, Zeynep (2021): Drohnentechnologie und moderne Kriegführung. Bestandsaufnahme und Perspektiven. Berlin: Carl Grossmann. https://doi.org/10.24921/2021.94115956

Chesterman, Simon (2020): Artificial intelligence and the limits of legal personality. In: International and Comparative Law Quarterly 69 (4), pp. 819–844. https://doi.org/10.1017/S0020589320000366

Defence Ethics Committee (2021): Opinion on the integration of autonomy into lethal weapon systems. Paris: Ministère des Armées. Available online at https://cd-geneve.delegfrance.org/IMG/pdf/defence_ethics_committee_-_opinion_on_the_integration_of_autonomy_into_lethal_weapon_systems.pdf?2423/17d8f6beb2f5c9caa9c9168c53c24a91d9d32513, last accessed on 14. 05. 2024.

Department of Defense (2023): DoD Directive 3000.09. Autonomy in weapon systems, 25. 01. 2023. Available online at https://www.esd.whs.mil/portals/54/documents/dd/issuances/dodd/300009p.pdf, last accessed on 29. 04. 2024.

Dombrowsky, Wolf (2013): Risiko und Risikoprävention. In: Monatsschrift für Kriminologie und Strafrechtsreform 2/3, pp. 234–240. https://doi.org/10.1515/mks-2013-962-317

Dremliuga, Roman; Prisekina, Natalia (2020): The concept of culpability in criminal law and AI Systems. In: Journal of Politics and Law 13 (3), pp. 256–262. https://doi.org/10.5539/jpl.v13n3p256

Fabiano, Joao (2023): Should weaponised moral enhancement replace lethal aggression in war? In: Israel Law Review 56 (2), pp. 201–224. https://doi.org/10.1017/S002122372200022X

Gless, Sabine (2017): Von der Verantwortung einer E-Person. In: Goltdammer’s Archiv für Strafrecht 164 (6), pp. 324–32. https://doi.org/10.5451/unibas-ep55627

Gless, Sabine; Weigend, Thomas (2015): Intelligente Agenten und das Strafrecht. In: Zeitschrift für die gesamte Strafrechtswissenschaft 126 (3), pp. 561–591. https://doi.org/10.1515/zstw-2014-0024

Guha, Manabrata; Galliott, Jai (2023): Autonomous systems and moral de-skilling. Beyond good and evil in the emergent battlespaces of the twenty-first century. In: Journal of Military Ethics 22 (1), pp. 51–71. https://doi.org/10.1080/15027570.2023.2232623

Haugstvedt, Håvard (2023): A flying reign of terror? The who, where, when, what, and how of non-state actors and armed drones. In: Journal of Human Security 19 (1), pp. 1–7. https://doi.org/10.12924/johs2023.19010001

ICRC – International Committee of the Red Cross (2014): Autonomous weapon systems, technical, military, legal and humanitarian aspects. Geneva: International Committee of the Red Cross. Available online at https://www.icrc.org/en/download/file/1707/4221-002-autonomous-weapons-systems-full-report.pdf, last accessed on 29. 04. 2024.

ICRC (2021): ICRC position on autonomous weapon systems. Geneva: International Committee of the Red Cross. Available online at https://www.icrc.org/en/download/file/166330/icrc_position_on_aws_and_background_paper.pdf, last accessed on 29. 04. 2024.

Lieblich, Eliav (2012): Show us the films. Transparency, national security and disclosure of information collected by advanced weapon systems under international law. In: Israel Law Review 45 (3), pp. 459–491. https://doi.org/10.1017/S0021223712000155

Meron, Theodor (2011): The making of international criminal justice. A view from the bench. Selected speeches. Oxford: Oxford University Press. https://doi.org/10.1093/acprof:oso/9780199608935.001.0001

Mozur, Paul; Hopkins, Valerie (2023): Ukraine’s war of drones runs into an obstacle – China. In: New York Times, 30. 09. 2023. Available online at https://www.nytimes.com/2023/09/30/technology/ukraine-russia-war-drones-china.html, last accessed on 29. 04. 2024.

Olásolo, Héctor (2005): The triggering procedure of the International Criminal Court. Boston, MA: Martinus Nijhoff. https://doi.org/10.1163/9789047415749

People’s Republic of China (2022): Working paper on lethal autonomous weapons systems, 09. 08. 2022. Geneva: Group of Governmental Experts on Emerging Technologies in the Area of Lethal Autonomous Weapons System. Available online at https://docs-library.unoda.org/Convention_on_Certain_Conventional_Weapons_-_Group_of_Governmental_Experts_(2022)/CCW-GGE.1-2022-WP.6.pdf, last accessed on 06. 05. 2024.

Porter, Jean (2010): Ministers of the law. A natural law theory of legal authority. Grand Rapids, MI: Wm. B. Eerdmans Publishing. https://doi.org/10.1111/j.1468-0025.2011.01733.x

Rabi, Shahaf; Plaw, Avery (2020): Israeli compliance with legal guidelines for targeted killing. In: Israel Law Review 53 (2), pp. 225–258. https://doi.org/10.1017/S0021223720000059

Reichert, Jochem (2011): Reaktionspflichten und Reaktionsmöglichkeiten der Organe auf (möglicherweise) strafrechtsrelevantes Verhalten innerhalb des Unternehmens. In: Zeitschrift für Internationale Strafrechtsdogmatik 6 (3), pp. 113–122. Available online at https://zis-online.com/dat/artikel/2011_3_535.pdf, last accessed on 29. 04. 2024.

Rutherford, Tim (2015): Everyone’s accountable. How non-state armed groups interact with international humanitarian law. In: Australian Defence Force Journal 198, pp. 76–82. Available online at https://search.informit.org/doi/pdf/10.3316/ielapa.710945148772352, last accessed on 06. 05. 2024.

STO – Science & Technology Organization (2020): Science and technology trends 2020–2040. Brussels: NATO. Available online at https://www.sto.nato.int/publications/Management%20Reports/2020_TTR_Public_release_final.pdf, last accessed on 29. 04. 2024.

Simmler, Monika; Markwalder, Nora (2017): Roboter in der Verantwortung? In: Zeitschrift für die gesamte Strafrechtswissenschaft 129 (1), pp. 20–47. https://doi.org/10.1515/zstw-2017-0002

Steinberg, Gerald; Herzberg, Anne (2018): NGO fact-finding for IHL enforcement. In search of a new model. In: Israel Law Review 51 (2), pp. 261–299. https://doi.org/10.1017/S0021223718000079

Thompson, Kristen (2024): How the drone war in Ukraine is transforming conflict. In: Council on Foreign Relations, 16. 01. 2024. Available online at https://www.cfr.org/article/how-drone-war-ukraine-transforming-conflict, last accessed on 29. 04. 2024.

Trusilo, Daniel (2023): Autonomous AI systems in conflict. Emergent behavior and its impact on predictability and reliability. In: Journal of Military Ethics 22 (1), pp. 2–17. https://doi.org/10.1080/15027570.2023.2213985

UN – United Nations (1977): Protocol additional to the Geneva Conventions of 12 August 1949, and relating to the protection of victims of international armed conflicts (Protocol I) of 8 June 1977. Available online at https://www.un.org/en/genocideprevention/documents/atrocity-crimes/Doc.34_AP-I-EN.pdf, last accessed on 29. 04. 2024.

USAID – United States Agency for International Development (2023): USAID delivers Skydio autonomous camera drones to Ukraine to document war crimes. In: USAID Press Releases, 27. 07. 2023. Available online at https://www.usaid.gov/news-information/press-releases/jul-27-2023-usaid-delivers-skydio-autonomous-camera-drones-ukraine-document-war-crimes, last accessed on 29. 04. 2024.

Veilleux-Lepage, Yannick; Archambault, Emil (2022): A comparative study of non-state violent drone use in the Middle East. The Hague: ICCT. Available online at https://www.icct.nl/publication/comparative-study-non-state-violent-drone-use-middle-east, last accessed on 16. 05. 2024.

Wall, Tyler; Monahan, Torin (2011): Surveillance and violence from afar. The politics of drones and liminal security-scapes. In: Theoretical Criminology 15 (3), pp. 239–254. https://doi.org/10.1177/1362480610396650

Authors

Dr. Vasilios Tzoufis

is a lawyer (Athens Bar Association) at the Supreme Court of Greece (“Areios Pagos”) and a postdoctoral researcher at Aristotle University of Thessaloniki (Faculty of Law). He has taught at schools of the Greek army and the Greek security forces.

Dr. Nikolaos Petropoulos

is a criminologist with a Ph. D. in criminal justice from the City University of New York. He is also a member of the EENeT Steering Committee.