The Group of Governmental Experts (GGE) has been discussing autonomous weapons systems (AWS) in the UN arms control context since 2017. The latest round of talks in Geneva in March was overshadowed by Russia’s invasion of Ukraine of 24 February 2022. Regulation of AWS is an increasingly remote prospect, and some representatives even admit privately that the talks may have failed. The new German government’s commitment to work to outlaw AWS increasingly looks like a labour of Sisyphus. Given that the GGE requires unanimity and constructive cooperation with Russia is off the table for the foreseeable future, other forums will need to be found for the international debate on AWS control. Germany must prepare for options within NATO, the European Union and the United Nations. It is clear that any meaningful process presupposes coherent coordination with the NATO partners at all levels. In order to achieve that, Germany must first develop a clear national position on AWS.
The first of two GGE meetings on AWS planned for 2022 was held in March in Geneva. Russia used the forum to justify its illegal invasion of Ukraine, which numerous states, including Germany, have sharply condemned. When the Russian delegation made its closing remarks, many of the delegates demonstratively left the room.
The same geopolitical tensions that culminated in Vladimir Putin’s war in Ukraine have already caused the de facto failure of the Geneva talks, even if the group will meet again for five days in July. Without Russian buy-in there can be no regulation of AWS through the GGE, which makes its decisions by consensus. All 125 high contracting parties to the Convention on Certain Conventional Weapons are entitled to participate in the GGE, while signatory states such as Egypt also have the right to speak. In reality, only about eighty states actually attend.
Fault lines within the GGE
Even before the Russian invasion it was clear that differences of substance within the Group of Governmental Experts precluded rapid agreement.
First of all, the GGE has failed to agree on a common definition of AWS. Most states support the proposal from the International Committee of the Red Cross (ICRC), under which an AWS is a “weapon system with autonomy in its critical functions” that is capable of selecting and attacking targets without human intervention. China, however, would only include systems possessing the capacity to autonomously modify and/or expand their strategic mission. Another group of states sees no need to define AWS at all, preferring instead to concentrate on the appropriate level, type, degree and form of human/machine interaction.
France took the initiative in 2021 and presented its own definition, which is also supported by Germany. It distinguishes between fully and partially autonomous systems: Fully autonomous lethal weapon systems are capable of selecting their target and initiating an attack without human intervention, as well as modifying their strategic mission. Germany and France believe that this category of weapons systems should be entirely prohibited. Partially autonomous systems, on the other hand, select and attack targets within a framework defined by human operators, but cannot make more far-reaching decisions on their own. This latter category, France and Germany argue, needs to be regulated in order to ensure that such systems are used only in accordance with legal and ethical principles. Other countries including Japan have indicated their interest in deepening the discussion on this proposed definition.
There is also disagreement over the terminology used to define the required level of human control. While many participating states prefer the term “human control”, the United States prefers the concept of “human judgment”. The Americans understand human control to mean direct manual intervention in the weapons system itself, and would rather control the effects of the weapons. The advocates of human control, by contrast, want control over the weapons systems themselves and not merely over their effects.
Another fault line is the arms race between the United States, Russia and China, which is especially pronounced in the sphere of new technologies. China in particular has massively increased its military spending to modernise its armed forces, and intends to become “the world AI leader” by 2030. While China does support regulation of AWS in the GGE, its narrow definition raises doubts as to whether it really wants to submit to regulation (presumably in light of its geopolitical ambitions).
The greatest tensions at the moment are obviously those between the United States (or NATO) and Russia. Even if Russia regards “meaningful human control” over AWS as indispensable, it opposes expanding the existing international legal framework. As long as Russia insists on that point it is hard to see a path to agreement in the GGE. This raises the question whether other forums might offer better prospects of progress. Germany should prepare for such scenarios. For example the German Federal Ministry of Defence could prepare a national position on AWS in conjunction with the German Foreign Office.
Defining a national position on AWS
Unlike France, Germany does not yet have a national position on AWS. The German government’s AI Strategy of 2018 touches only superficially on military uses of artificial intelligence. The only real marker is a commitment in the current coalition agreement to actively promote the outlawing of “lethal autonomous weapons systems that are completely beyond the control of humans”. One reason for the lack of a national position might be found in the differences on AWS between the Foreign Office and Defence Ministry. Germany also prefers to reach agreement with important partners, such as the United States, France and the Netherlands, before taking action. In fact both the United States and France have already adopted national positions on AWS; it appears that Germany wishes to wait and see what the coming months bring, and especially how the war in Ukraine progresses.
A national position could include the distinction – made at the 2021 Geneva talks by France and Germany – between fully and partially autonomous systems. It must be noted here, however, that both France and Germany are still only talking about “lethal” weapons systems. This restriction is problematic for two reasons. Firstly, non-lethal weapons can also cause great suffering to combatants and civilian populations. Secondly, it is difficult to draw a clear line between lethal and non-lethal weapons.
A national position should also lay out the parameters within which the use of partially autonomous weapons systems is to be permitted. It must therefore necessarily also address the degree of human/machine interaction and the concept of human control. Preparatory studies by the International Panel on the Regulation of Autonomous Weapons (iPRAW), in which the present authors play a leading role, could supply valuable pointers in this respect. iPRAW understands human control to require “situational understanding and options of intervention during attack”. Both elements must be ensured by the design of the weapons system (control by design) and be implementable during its deployment (control in use). The concrete extent of human control will depend on the specific context of utilisation.
In other words, a national position will need to address the various scenarios and identify the associated red lines. Deployment of AWS in urban environments could be subject to tighter restrictions than in purely military settings, for example by making it obligatory for a human to monitor their use and if necessary intervene. Human intervention would be less relevant with autonomous submarines for example.
In light of current geopolitical developments, however, Germany should remain open towards technological innovation and take account of the military benefits of autonomy in its national position. Speed, targeting precision and force protection are among the factors that weigh in favour of the use of AWS.
The new geopolitical situation and its influence on Germany’s position on AWS
Calls for Germany to adopt a national position must also be seen in the context of the transformed security environment: In his landmark speech on 27 February 2022, Chancellor Olaf Scholz declared: “We will now – year after year – invest more than two percent of our gross domestic product in our defence.” He also announced that the government would put €100 billion into a special fund for the Bundeswehr, which has now received the Bundestag’s approval. These developments have two principal implications for a national position on AWS:
Firstly, they will boost existing defence projects, such as the Future Combat Air System (FCAS). FCAS is a system of systems bringing together existing and new elements; parts of the FCAS are also to be equipped with autonomous functions. An independent panel of experts on the responsible use of new technologies in the FCAS will ensure that there is no contradiction with the government’s commitment to outlaw weapons systems operating outside human control. One of the central points of discussion within the panel is the concept of human control and its concrete operationalisation.
Secondly, in light of the large increase in defence spending announced by Olaf Scholz, there will need to be careful scrutiny of whether procurements are actually necessary and sensible. New technologies might offer worthwhile savings on personnel, for example in the form of AI-based data analysis. Although that particular case does not actually involve AWS, the questions thrown up by the increasing use of AI are similar however it is employed.
So the government’s commitment to outlawing AWS does not imply a general rejection of technological innovation in the military sphere. On the contrary, a national position on AWS should include a clear commitment to the necessity for research and development on military technologies, and name the benefits. Only once such a differentiated national position has been defined will Germany be able to participate actively in future negotiations in coordination with its partners.
Alternative forums for regulating AWS
NATO, the EU and the UN are the principal alternative forums for regulating AWS (see figure). It is also conceivable that other states might initiate a process outside the institutions altogether.
NATO – finding a common transatlantic line
Even if regulating AWS falls outside NATO’s remit, it could nevertheless serve to bring the individual NATO partners closer together on the issue and thus strengthen their position in other forums, for example at the UN.
In autumn 2021 NATO published an Artificial Intelligence Strategy whose six principles for responsible use are also relevant for AWS: lawfulness; responsibility and accountability; explainability and traceability; reliability; governability; and bias mitigation. But it says very little on the concrete question of human/machine interaction. In autumn 2022 NATO will publish its Autonomy Implementation Plan, which can be expected to go into greater detail on the human/machine interaction question. It is unclear whether the Autonomy Implementation Plan will principally reflect the American position or also account for the views of the other NATO states. Fundamentally the United States argues for a broader understanding of the degree of autonomy to be permitted in weapons systems. This is also underlined by a paper on good practices in the area of lethal AWS, which the United States submitted to the GGE in March jointly with NATO partners Canada and the United Kingdom and close allies Australia, Japan and South Korea.
In 2020 NATO established a working party on human systems integration for meaningful human control over AI-based systems. The working party’s members are researchers who advise on the operationalisation and implementation of human control over AWS on the basis of their experience (rather than representing state positions). One difficulty is that the working party uses the term “meaningful human control”, which the United States in particular rejects.
The working party could make a positive contribution by considering other formulations, without overly altering the terms of the debate or reopening old conflicts. This could also give the United States the opportunity to agree definitions with its NATO partners. Even the ICRC deviates from its established terminology in its 2021 position paper on AWS, and now speaks of “human control and judgement”. The ICRC’s position could function as a door-opener expanding the terminological side of the debate – always presupposing agreement can be reached on the aspect of human/machine interaction.
Finally it must not be forgotten that not all members of the European Union are also members of NATO. For instance Austria and Ireland participate actively in the GGE talks in Geneva and in NATO’s Partnership for Peace (PfP). Through this latter format they could be included in the substantive discussion on AWS. Russia also participates in the PfP, but cannot be expected to contribute meaningfully in the foreseeable future. In light of current events that would not be welcomed either.
An open and informal exchange among NATO states would create a good opportunity to keep the AWS discussion going. But a discussion between NATO states and willing non-NATO states could also contribute to agreeing a shared line. The disadvantage would be that NATO would not be holding a dialogue on AWS with all states, but prioritising cooperation with particular third states.
EU – the “Brussels effect”
Germany should also press for a broader discussion within the EU, where divergences on AWS still exist. If the member states were able to agree on a joint position the EU could lead the way on international standard-setting and expect others to follow (the “Brussels effect”). This has already occurred in the field of cybersecurity. The European Parliament has discussed the issue of AWS several times and calls for legally binding regulation, while the European Defence Fund does not support “actions for the development of lethal autonomous weapons without the possibility for meaningful human control over selection and engagement decisions when carrying out strikes against humans”.
Discussions directed towards finding a shared European position would be especially useful for the Permanent Structured Cooperation (PESCO). PESCO was launched in 2017 to allow groups of willing member states to coordinate activities and forward planning in areas related to the Common Foreign and Security Policy (CFSP). The twenty-five PESCO states are already working on autonomous functions for weapons systems in joint defence projects. For example, in an Estonian-led project Germany and nine other EU member states are developing an unmanned ground system. Similar projects are also working on naval and aerospace technologies, often with German involvement. For example an AI-based unmanned anti-submarine system is being developed under Portuguese leadership.
The extent of autonomous functions in these weapons systems is not entirely clear from the public data. But closer investigation of these systems and the question of human/machine interaction could certainly supply valuable insights and provide meaningful input for other forums. Even if PESCO’s focus is on arms development, it is nevertheless conceivable, and above all productive, also to conduct substantive discussions there that could subsequently have a positive influence on other processes, for example in NATO and the UN.
United Nations – the normative power of resolutions
Even if the GGE has been the central forum for discussions on regulating AWS, it is quite possible that the process could shift to other institutions within the UN. Believing that the GGE has failed, the NGO Campaign to Stop Killer Robots declared in March 2022 that “an alternative process of legal development is now inevitable”. The Latin American, African and Asian states of the G-13 group in the GGE would also like to see other UN institutions becoming involved. Germany needs to be prepared for such developments and should consider whether and on what terms it would participate in such a process. Realistically speaking, two UN forums offer potential alternatives:
The first of these is the UN General Assembly’s Committee on Disarmament and International Security, also known as the First Committee. Its resolutions are normally adopted unanimously, as in the GGE, so states that disagree would be able to block any decision. However, non-unanimous preparatory processes in the First Committee could advance the discussion in the General Assembly, or even lead to the General Assembly itself passing a resolution on AWS. The General Assembly is the other UN forum in which AWS could be discussed. Its decisions are non-binding and made by majority voting, so resolutions can be adopted against the resistance of particular states.
The experience of the GGE talks shows that most states favour regulation of AWS. Even China explicitly underlined its support in a position paper last autumn. However, it has also become apparent that the sticking point is not lack of interest but disagreements concerning the rules for human/machine interaction. One benefit of discussing AWS in the General Assembly could lie in the greater political impact of General Assembly resolutions – although there would still be a real risk of failure to make meaningful progress.
If a General Assembly resolution is to generate political momentum it will need to attract broad support. That could prove problematic in light of the current conflicts. For example China might refuse to vote for a resolution that was not supported by Russia. Obviously any resolution on AWS will need the support of all the major powers if regulation is to be credible and enduring.
Recommendations for Germany
It is very likely that the GGE will be declared a failure after this year’s second meeting in July. If Germany is to live up to its commitment to work to outlaw AWS it needs to develop a strategy for the period after the GGE.
That means preparing a national position on AWS, which should distinguish between fully and partially autonomous weapons systems and provide specific guidance on the requirements for human/machine interaction. It would certainly make sense to include non-lethal systems too. To address the refusal of the United States and others to accept the term “human control”, Germany could sign up to the ICRC approach of speaking of “human control and judgement”. The planned German National Security Strategy could also contain a section on AWS and new weapons technologies, perhaps modelled on the Swiss Arms Control and Disarmament Strategy 2022–25.
Russia’s invasion of Ukraine appears to mark a paradigm shift in the realm of arms control. Universal forums have been successively sidelined, while regional institutions such as NATO and the EU gain in importance. But AWS issues cannot be settled in regional forums, certainly not inclusively. Yet discussing them within NATO and the EU could at least coordinate partners’ stances more closely and in the medium (and longer) term enable them to enter broader future talks with a single voice.
Germany should also prepare for a discussion on AWS in the First Committee, which the G-13 states in particular might initiate. A coordinated approach by the NATO partners and the EU member states could contribute to finding broad majorities in the individual UN institutions.
Under current circumstances none of the potential forums guarantees success. But it would be pointless to allow that to paralyse the AWS discussion. The deployment of new technologies in the war in Ukraine underlines the growing importance of autonomy in weapons systems. The announced defence spending plans should give occasion to reflect on the complexity of autonomy in weapons systems, and to develop a national position on AWS. Germany could show the world that it can be counted on to keep its political promises.
Dr. Elisabeth Hoffberger-Pippan is a Researcher in the International Security Division, where she heads a project on the International Panel on the Regulation of Autonomous Weapons (iPRAW). Vanessa Vohs and Paula Köhler are Research Assistants in the International Security Division and the iPRAW project. iPRAW is an independent body composed of researchers from a range of disciplines. It receives funding from the German Foreign Office.
© Stiftung Wissenschaft und Politik, 2022
All rights reserved
This Comment reflects the authors’ views.
SWP Comments are subject to internal peer review, fact-checking and copy-editing. For further information on our quality control procedures, please visit the SWP website: https://www.swp-berlin.org/en/about-swp/quality-management-for-swp-publications/
SWP
Stiftung Wissenschaft und Politik
German Institute for International and Security Affairs
Ludwigkirchplatz 3–4
10719 Berlin
Telephone +49 30 880 07-0
Fax +49 30 880 07-100
www.swp-berlin.org
swp@swp-berlin.org
ISSN (Print) 1861-1761
ISSN (Online) 2747-5107
DOI: 10.18449/2022C43
Translation by Meredith Dale
(English version of SWP‑Aktuell 36/2022)