
Preventive Regulation of Autonomous Weapon Systems

Need for Action by Germany at Various Levels

SWP Research Paper 2019/RP 03, 23.01.2019, 24 Pages

doi:10.18449/2019RP03


Anja Dahlmann is an Associate in the International Security Division at SWP.
Dr Marcel Dickow is Head of the International Security Division at SWP.

The authors argue that in order for Germany to do justice to its claim of outlawing lethal autonomous weapon systems (LAWS) internationally, the Federal Government should first define the term “human control”, for example in a strategic document from the Federal Ministry of Defence. The aim should be to facilitate the regulation of the development and use of LAWS – at the international level – thus making the issue of military robotics politically manageable.

The international framework for negotiating the regulation of LAWS is currently the United Nations Convention on Certain Conventional Weapons. A Common Position of the EU member states that demands human control – or, better still, offers proposals for its design – could have a decisive influence on the negotiations.

Issues and Recommendations

Preventive Regulation of Autonomous Weapon Systems
Need for Action by Germany at Various Levels

Lethal autonomous weapon systems (LAWS) are weapons that do not require human intervention in order to select and engage a target. This principle is already used in air defence today, but only in simple and clearly structured environments. LAWS, on the other hand, would be able to operate in complex, dynamic environments.

This has been enabled by recent developments in sensor technology, computing power, and software capabilities. Especially relevant are mathematical methods, which are often summarised under the term “artificial intelligence” (AI).

Technology shapes our everyday life, and we ought to reassess our relation to it constantly. This also applies to the intersection of technology and security policy: The question of the deployment and disarmament of nuclear weapons, as well as the question of offensive and defensive cyber capabilities – that is, security in cyberspace – are representative of the many topics reflected in social and political debates. With robotics and the application of AI, new technologies are finding their way into current debates on military and international security policy.

With the possibility of autonomous vehicles roaming Germany’s roads in the future, it becomes obvious that the necessary technology and its use must be subject to certain rules in order to guarantee general safety and compliance with legal requirements. Will this also apply to the use of autonomous weapon systems? This question and the underlying technological, legal, ethical, and security aspects are the subject of debate at the international level.

This study first briefly outlines the extent to which autonomous weapon systems are the subject of political debate. It then examines the possible effects of the development and use of LAWS. On this basis, it examines the debates on the regulation of LAWS at the international, European, and national levels and derives recommendations for action by the Bundestag and the Federal Government. It focusses on three perspectives: the technological-operational, the legal, and the ethical.

From a technological-operational point of view, unmanned weapon systems – especially those with autonomous functions – are important because they change the military approach in combat, for example by requiring fewer personnel during deployment. Control of an unmanned system is not tied to the battlefield; systems with autonomous functions require only one supervisor, who may observe an entire swarm of robots. Removing the need for a communication link also enables shorter response times and operations in hard-to-reach areas. At the same time, this new technology necessitates the adjustment of military structures and processes in order to take full advantage of opportunities and to minimise possible risks.

International humanitarian law is particularly relevant to the legal considerations of autonomous weapon systems. Here, principles such as the requirement to distinguish between civilians and combatants (discrimination), the proportionality of means and ends, and the military necessity for the use of force apply. Some considerations can already be made in the run-up to an attack; others must be decided during the actual situation. Especially in dynamic decision-making cycles in the selection of military targets (targeting cycle), legal concerns arise when using LAWS. So far, there have been no technical solutions for the conversion of abstract legal concepts (such as the principle of discrimination) into machine rules – but even if this were possible, the human being would remain the legal subject and must therefore make the decision. This calls for sufficient human control in the targeting cycle.

From an ethical perspective, LAWS are particularly problematic with regard to human dignity, because robots do not understand what it means to kill a human being. Without this capacity for reflection, however, the human being selected as the target becomes a data point, that is, just an object. The use of autonomous weapon systems would thus violate the dignity of the victim – even technical improvements cannot solve this problem.

Consideration of these technological-operational, legal, and ethical perspectives shows that a potential change in the nature of warfare emerges when humans cede the decision to use force – that is, to kill people – to machines. Despite operational advantages, the problematic consequences predominate – human control is indispensable. A consensus for this principle of human control is emerging, both internationally and in Germany, but the concrete form of legal regulations is unclear or controversial.

The Federal Government’s advocacy for a ban on weapons without human control under international law is therefore still called for. It would be helpful for the Federal Government to take a position on how it understands the term “human control”, for example in a strategy document of the Federal Ministry of Defence (BMVg). The aim should be to make it possible to regulate the development and deployment of LAWS – at the international level – thus making the issue of military robotics politically manageable.

The international framework for negotiating the regulation of LAWS is currently the United Nations Convention on Certain Conventional Weapons (CCW). The meetings of the CCW States Parties thus form the forum in which a norm for human control over the use of force should be created. A Common Position of the member states of the European Union (EU) that demands this human control – or, better still, offers proposals for its design – could have a decisive influence on the negotiations.

Autonomous Weapon Systems As the Subject of Political Debate

The debate about the development and use of robots has changed in recent years. Autonomous vehicles are a mainstream topic in German industrial policy, and applications of AI have found their way into commercial and military hardware and software: New smartphones are equipped with hardware for AI applications; algorithms are learning and playing computer games and classic board games better than humans. Data-driven machine-learning opens up the potential for new applications in almost all areas of life. In April and November 2018, the European Commission and the German Federal Government published dedicated AI strategies1 for the first time. It comes as no surprise that national and international political debates have become more intense and multi-faceted. However, they are mainly taking place with regard to the civilian use of AI, for example in the German Ethics Commission on Automated Driving. The military implications, on the other hand, are hardly discussed by the general public.

The German Bundestag, too, has yet to address in earnest the security policy and international law challenges of military robotics. In contrast, the Belgian Parliament, for example, passed a resolution calling for a ban on autonomous weapons in June 2018.2

In September 2018, the European Parliament (EP) adopted its first resolution on autonomous weapon systems, which is not binding for the member states.

Definitions

Where complex issues require scientific, social, and political classification, generally accepted definitions help. A major problem in the debate on the civilian and military use of AI and robotics is precisely the lack of such definitions. This study uses the general term “unmanned military system” (UMS) for any form of military robotic hardware, be it remote-controlled or equipped with autonomous functions. In the context of international discussions on these systems, the United Nations Convention on Certain Conventional Weapons has taken up the term “LAWS” (lethal autonomous weapon systems), which we use specifically for weapon systems with autonomous functions. At the technical level, we do not speak of “autonomous weapon systems”, but of “weapon systems with autonomous functions”, since it is not the degree of autonomy that is relevant, but the functions in which the human being is supported or replaced by the machine.

The more attention the issue draws internationally, the more necessary it becomes for the German Parliament to address it in order to grasp the security and arms control implications of the technology.

Political reflection on autonomy in weapon systems is still in its infancy in Germany.

After all, the Bundeswehr already includes more automated and partly autonomous weapon systems in its force planning and could be confronted with these systems in alliances or on the battlefield. While the Federal Foreign Office has been helping to shape the international debate since 2013, for example within the framework of the CCW, the political debate in the German Parliament has focussed almost exclusively on armed drones. A hearing in the Subcommittee on Disarmament and Arms Control in 2016 was the exception. The approval by the Bundestag in the summer of 2018 for the procurement of drones that are able to carry arms (bewaffnungsfähig) leaves numerous technical and organisational questions unanswered (training and education of the armed forces, procurement of ammunition, etc.). It also remains unclear what political impact this procurement project will have on the use of autonomous functions in future generations of aerial weapon systems.

The concept of degrees of autonomy further complicates the debate on military robotics. How exactly individual degrees of autonomy can be defined, and which consequences result from them, remains completely unclear. Robots are neither under complete human control nor fully autonomous. The people who operate them are supported to a considerable degree by assistance systems that control important tasks. This alone blurs the boundaries between terms such as “automated”, “semi-autonomous”, and “autonomous”, which are often used in political debates.
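This function-based view can be made concrete. The following Python sketch is purely illustrative – the function names and control modes are our own, not an established taxonomy – but it shows how describing a system per function, rather than by a single “degree of autonomy”, makes visible where human control actually sits.

```python
# Illustrative sketch: describing a weapon system by the control mode of
# each function instead of a single "degree of autonomy". The categories
# and function names are assumptions made for this example.
from dataclasses import dataclass
from enum import Enum

class ControlMode(Enum):
    HUMAN = "performed by a human"
    ASSISTED = "machine prepares, human decides"
    AUTONOMOUS = "performed by the machine"

@dataclass
class SystemProfile:
    navigation: ControlMode
    target_selection: ControlMode
    engagement: ControlMode

# A single label such as "semi-autonomous" would hide this distribution:
example = SystemProfile(
    navigation=ControlMode.AUTONOMOUS,
    target_selection=ControlMode.ASSISTED,
    engagement=ControlMode.HUMAN,
)
print(example)
```

Such a profile also shows why the “critical functions” of target selection and engagement, discussed below, lend themselves better to regulation than an overall autonomy label.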

The Development and Use of Weapon Systems with Autonomous Functions: Political and Military Implications

The technology of unmanned systems can be used for both civilian and military purposes (dual use). Since the civilian market for robotics is considerably larger, many technical components of military applications (e.g. sensors and software) originate from civilian developments. This circumstance not only challenges the civil clause3 of German universities, it also undermines export controls, the prevention of proliferation, and verification: The individual components can be procured on the civilian market and adapted to military purposes with relatively little effort.

“Unmanned systems” or “robots” can mean both remote-controlled and autonomous machines; this study focusses on the latter. In that regard, technologies such as remote-controlled drones are only relevant as precursors. Machine autonomy means that the machine can perform certain tasks without human intervention in dynamic environments.

A prerequisite for this ability are the techniques of AI, a generic term for numerous programming methods. The contents of this field of research are constantly changing but are always based on mathematical, often statistical, methods. Experts in AI research disagree as to whether, and when, it will be possible to replicate human intelligence. Currently, the competences of software surpass specific human abilities, but they do not amount to a human-like mind. The emphasis is therefore more on the artificial aspect than on the intelligence, so the term “artificial intelligence” should be used sparingly and carefully.

The procedures of AI are crucial for the implementation of autonomous functions of machines. Machine-learning4 in particular requires a large amount of relevant, (pre-)structured data to train the mathematical models. Since the 2000s, such data have been easy to obtain on the Internet for certain applications; the Internet of Things (IoT)5 will further increase the availability of data. However, this does not apply to the same extent to data for the training of machine-learning military systems, since these data cannot be obtained from civilian life, or only to a very limited extent.
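A minimal supervised-learning sketch makes this dependence on data tangible. Everything here is synthetic and illustrative – the generated feature set merely stands in for pre-structured sensor data – but the point is generic: the model is nothing but a product of its labelled training examples.

```python
# Minimal supervised-learning sketch with synthetic stand-in data.
# The model's behaviour is entirely determined by the labelled examples.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Stand-in for relevant, (pre-)structured, labelled data.
X, y = make_classification(n_samples=1000, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"Accuracy on held-out data: {model.score(X_test, y_test):.2f}")
```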

Inexpensive storage systems, increased computing power, and big data have enabled machine-learning to enter everyday life. The application of these technologies can now be found in many sectors: in the services sector, in insurance and finance, in public services such as criminal policing, but also in science and research. For some years now, AI procedures have also been used in the armed forces, for example in assistance systems on ships or in the evaluation of image data from reconnaissance drones.6

Machine Autonomy and Human Control

From a legal – but especially from an operational – perspective on weapon systems with autonomous functions, the human–machine relationship is of great importance. A crucial determination is the design of the human role during operations, that is, how much autonomy is granted to the weapon system and what the human being decides. Therefore, for a long time, the question of a suitable definition of autonomy dominated national and international debates – combined with a fuzzy definition of what exactly LAWS are. The multilateral discussion process on these weapon systems at the CCW reflects this well: In the CCW debate, and successively also in media coverage, the necessity for human control over such systems has moved to the centre of attention. This concept can be derived directly from international humanitarian law and the necessity for making certain decisions about the use of force. However, the debate is shaped by a few scenarios, such as the fully autonomous drone7 or the mobile combat robot,8 although such scenarios have only limited predictive power with regard to the technical development and deployment of future LAWS. Of greater relevance are developments in the software sector (especially deep neural networks and AI),9 in swarm systems, and in assistance systems for human–machine teaming. In order to take into account the various technological developments, many actors are calling for a general ban on the development and use of LAWS to prohibit the transfer of the decision to kill to a robot.

However, this focus on the functions of target selection and engagement – termed “critical functions” by the International Committee of the Red Cross – leaves a broad grey area. For example, the decision to kill and the execution of this action are not necessarily contained within one single autonomous robot. Instead, they can be performed by different parts of a complex weapon system, a so-called system of systems. The exact responsibilities may become unclear.

Assistance systems blur the lines between remote control, automation, and autonomy.

Even more problematic are assistance systems such as those already in use in civilian and military applications. They take certain decisions away from people, or prepare those decisions by filtering the collected data and offering options. It is questionable whether humans can actually comprehend the information or the way in which these options were created by algorithms. This shift in competence is a creeping development that can be seen as an inevitable side-effect of, or an internal logic pushing towards, increasing autonomy. This trend towards autonomy results from technical, but also military, considerations.

Technically, the transfer of decision-making authority to machines makes sense for two reasons: First, in the case of a remotely controlled UMS, the interruption of the communication link between the station and the device usually leads to an abort of the mission, which can result in delays and dangers for one’s own soldiers. Second, the use of unmanned systems increases the amount of available data, as more and more sensors or units (swarms) are used. Humans can be overwhelmed by this flood of information, which is why assistance systems are already being used today as filters. As the amount of data increases, these systems will be given more and more decision-making authority, while human operators become supervisors.
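How such a filter quietly absorbs decision-making authority can be sketched in a few lines; the scoring and the cut-off below are invented for illustration. Whatever the filter drops never reaches a human at all – the machine has, in effect, already decided about it.

```python
# Illustrative sketch of an assistance system acting as a filter:
# only the top-ranked sensor contacts ever reach the human operator.
# The relevance scores and the cut-off k are invented for this example.

def filter_for_operator(contacts, k=5):
    """Rank contacts by machine-assigned relevance; pass only the top k on."""
    ranked = sorted(contacts, key=lambda c: c["relevance"], reverse=True)
    return ranked[:k]

contacts = [{"id": i, "relevance": i / 1000} for i in range(1000)]
shown = filter_for_operator(contacts)

# 995 of 1,000 contacts never reach a human: the operator's "decision"
# has been pre-shaped by the filter.
print(f"{len(shown)} of {len(contacts)} contacts shown to the operator")
```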

Both aspects are also relevant for military considerations and will be amplified by the increasing speed of operations. This applies when a conflict party deploys LAWS, which can (or at least will soon be able to) react much faster than humans. In order to avoid a military disadvantage, the use of autonomous systems then seems just as necessary for all other potential conflict parties, at least for tactical reasons.

The Technical State of LAWS and the Political Perception

We encounter assistance systems every day while driving (e.g. lane departure warning systems, brake assist, navigation), in private life (e.g. on mobile devices, such as Apple’s Siri and Microsoft’s Cortana), and in the work environment (e.g. in production processes and logistics). For some people, those assistants have become such a natural part of life that they are no longer fully aware of their influence on decision-making processes.

This underestimation is accompanied by a partial overestimation of the cognitive abilities of machines, especially robots. The immense mathematical and algorithmic effort needed to convey a sufficient image of the environment to a robot is usually known only by experts. The fact that the informed and very human-like answers of Alexa and similar programmes are not based on a general machine intelligence but composed of pre-programmed, individual cases usually remains hidden from the user. The limited applicability of the underlying mathematical models becomes apparent only when curious answers are given to questions that contain subtle context.10 Machine intelligence has no understanding of the environment (cognition), nor will it have one for the foreseeable future.

These technical limitations also apply to the military use of such systems. In recent years, an incomplete picture of the possibilities, challenges, and risks of LAWS has emerged, which is reflected in German as well as international debates on the regulation of LAWS. In particular, the role of assistance systems in political debates is hardly emphasised, although they are almost ubiquitous in the civilian world and play a major role in determining the functionality of systems. Since assistance systems blur the line between distinctions such as “automated” and “fully autonomous”, leaving a grey area, they should be considered in a regulation (see the discussion of assistance systems above).

At the same time, CCW States Parties have expressed their expectations that techniques such as AI will improve the implementation of principles of international humanitarian law in the use of force. Such principles are the requirements for discrimination (military versus civilian population), the proportionality of means and ends, and military necessity. These terms are defined in a legally abstract manner and are context-bound, which makes their implementation in machine rules more difficult – perhaps even impossible. But even if this is achieved one day, the human remains the legal subject and must therefore make the decision. Such a legal decision requires sufficient human control in the decision-making cycle during the selection of military targets, the so-called targeting cycle.11

The targeting cycle

The decision-making cycle for dynamic target selection used by the US military is described here as an example; a schematic sketch follows the list. The cycle consists of six steps, namely:

1) Find: Searching for targets that meet initial criteria in designated areas

2) Fix: Identifying, locating, prioritising, and classifying the target

3) Track: Continuously tracking the target

4) Target: Determining the desired effect, developing a targeting solution, obtaining legal approval to engage

5) Engage: Striking the target with the determined and approved weapon

6) Assess: Reviewing the effects of the engagement
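One schematic way to express where human control sits in this cycle is shown below. The step names follow the list above; the placement of the human gate at Target and Engage is our illustration of the “critical functions” debate, not a description of any fielded system.

```python
# Schematic sketch: the targeting cycle as a sequence of steps, with a
# human-decision gate before the critical functions. Illustrative only -
# the gate placement is our reading of the debate, not doctrine.
from enum import Enum, auto

class Step(Enum):
    FIND = auto()      # searching for targets
    FIX = auto()       # identifying and classifying
    TRACK = auto()     # continuous tracking
    TARGET = auto()    # targeting solution and legal approval
    ENGAGE = auto()    # weapon release
    ASSESS = auto()    # effect review

# The "critical functions": transitions into these steps require a human.
REQUIRES_HUMAN_DECISION = {Step.TARGET, Step.ENGAGE}

def may_proceed(step: Step, human_approved: bool) -> bool:
    """A step proceeds only if it is delegable or a human has approved it."""
    return step not in REQUIRES_HUMAN_DECISION or human_approved

assert may_proceed(Step.TRACK, human_approved=False)       # machine may track
assert not may_proceed(Step.ENGAGE, human_approved=False)  # release halts
```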

In addition, the principles of international law are ethical – and thus human – concepts of the humanitarian regulation of war and therefore not reproducible by a machine. Nevertheless, planners in armed forces are hoping for algorithms to be developed that can, for example, distinguish between military personnel and civilians better than humans.12 This is often based on the idea that ethical-humanitarian requirements can be translated into machine rules and causal relationships, even if the machine is only intended to support the human in weighing them up.13 This may actually be conceivable in some cases under bounded conditions, for example in simply structured environments without humans present. Such special cases are often generalised, though. Furthermore, the assumption that those special cases exist may be wrong due to changes in the environment or an adversary’s tactics. This aspect has been neglected in the CCW discussions so far.

Following the assumption of the calculability of the world – usually called digitisation and robotisation – it is easily overlooked that a human being, and thus human action, is anything but calculable. This is one of the greatest challenges of human–machine interaction, whether cooperative (i.e. the machine provides information to its operator in combat) or uncooperative (i.e. the machine remains opaque to the opposing side) – a challenge that is only vaguely reflected in the political debate.

Effects on the Armed Forces

Robots change military procedures in combat by requiring fewer personnel in the field: In principle, unmanned systems can be controlled from anywhere in the world; with increasing autonomy, a single “operator” is theoretically sufficient to monitor entire swarms of robots. Land robots, in particular, can support soldiers, and they already enable operations that are too dangerous for humans. Nevertheless, robots do not bring only advantages. Often-cited arguments such as cost-savings, reduced personnel deployment, and greater precision during operation do not always hold. In addition, the argument of protecting one’s own armed forces often prevents other aspects from being taken into account, and thus from being weighed up.

Recent research14 has shown that robotics and vulnerabilities in data spaces and “command and control” infrastructures are linked at both the tactical and strategic levels. In the absence of human opponents, conflict parties could resort to the technological infrastructure of the other side as a target for attack. Typical users of robots, namely technologically advanced states, are particularly vulnerable because they are dependent on these structures.

Growing Data Volumes and Machine‑Learning

Robots, especially learning systems, require a large amount of sensor and training data to function properly. This creates both quantitative and qualitative challenges. First of all, it is generally questionable whether more sensor technology, and thus more data, actually enables more consistent and predictable machine behaviour or better human decisions. At a certain point, the flood of information can unsettle decision-makers, cause inconsistencies, and delay decisions. This is why modern robotic systems use sensor data fusion and information filtering. The choice of filter method is critical for the result and can complicate the attribution of responsibility. If filters shape the information that reaches the operator or commander without themselves being controllable by humans, it is doubtful whether a significant level of control exists – and thus whether attributable decisions can be made in the field.
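A deliberately simple fusion example illustrates why the choice of filter method is critical; the sensors, weights, and threshold are invented for this sketch. Parameters that the operator never sees decide whether a contact appears on the display at all.

```python
# Deliberately simple sensor-fusion sketch. The weights and the display
# threshold are developer-made design choices, invisible to the operator,
# yet they determine which contacts are shown at all.

def fused_confidence(radar: float, infrared: float,
                     w_radar: float = 0.7, w_infrared: float = 0.3) -> float:
    """Combine two sensor confidences into one score (weighted average)."""
    return w_radar * radar + w_infrared * infrared

DISPLAY_THRESHOLD = 0.6  # contacts below this score are silently dropped

def visible_to_operator(radar: float, infrared: float) -> bool:
    return fused_confidence(radar, infrared) >= DISPLAY_THRESHOLD

# The same raw readings yield different outcomes under different weights:
print(visible_to_operator(0.4, 0.9))                            # False: dropped
print(fused_confidence(0.4, 0.9, w_radar=0.3, w_infrared=0.7))  # 0.75: shown
```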

The regulation of LAWS has to take into account the data basis of learning systems.

Data-driven algorithms also raise the question of the representativeness and neutrality of the training data. The results of the mathematical methods used depend to a large extent on whether the training data correspond to the application. If the training data are distorted, the algorithms fail in application and produce unpredictable results. Research on the civilian use of training data already shows such limitations today.15
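This effect can be reproduced with a few lines of synthetic data: a model trained under one set of conditions degrades sharply when the deployment data are shifted. The data and the size of the shift are invented for this sketch; the failure mode is generic.

```python
# Synthetic illustration of distorted training data: a classifier trained
# on one distribution is evaluated on a shifted one. Numbers are invented.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_data(n, shift=0.0):
    """Two classes separated along a diagonal; `shift` moves the scene."""
    class0 = rng.normal(loc=-1 + shift, scale=1.0, size=(n, 2))
    class1 = rng.normal(loc=+1 + shift, scale=1.0, size=(n, 2))
    X = np.vstack([class0, class1])
    y = np.array([0] * n + [1] * n)
    return X, y

X_train, y_train = make_data(500)               # "training conditions"
X_shifted, y_shifted = make_data(500, shift=2)  # "deployment conditions"

model = LogisticRegression().fit(X_train, y_train)
print("Accuracy, training-like data:", model.score(*make_data(500)))
print("Accuracy, shifted data:      ", model.score(X_shifted, y_shifted))
```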

Also unsolved is the question of how military training data for learning systems can be generated to a sufficient extent and representativeness. Although the Internet and the IoT are steady suppliers of pre-structured civilian data, such data sets are lacking for military applications. The creation of synthetic training data could provide a remedy. However, it is also subject to man-made models of the real world. Synthetic training data thus remain erroneous and incomplete, especially when it comes to unpredictable interactions in (real) conflict situations.

This could be a starting point for the regulation of LAWS under international law: to address, first of all, the data necessary for weapon systems, as well as their acquisition and use.

Military Operations and Structures

Although it can be difficult to create a reliable supply of suitable training data, the use of unmanned systems confronts the user with a veritable flood of information. This has an impact on military structures and decision-making processes.

At the lowest levels, decision-makers could be overwhelmed by the growing flood of information. To avoid the increasing transfer of competences to computers (see the section on assistance systems above), soldiers must be trained accordingly. The demands on the cognitive abilities of the personnel are thus increasing. This is unproblematic as long as only a small section of the armed forces is using robotic systems. If the machines replace not only older transport systems, but also manned fighter jets, tanks, and ships, there is a need for action: If decisions are to remain comprehensible and controllable for humans, the density of information – and thus the complexity of decisions – will increase. This increase in complexity is changing the demands placed on humans: Either only highly qualified personnel – who may be difficult to find – can be hired, or tasks that involve greater intellectual demands must be bundled and transferred to a higher hierarchical level. This would change the recruitment strategies of armed forces.

Not only would there be new demands for higher hierarchical levels or technical specialists, but also for cooperation between humans and machines in the field. Even remote-controlled robots are sometimes perceived differently in the armed forces than conventional weapons and tools;16 robotic systems with autonomous functions are now showing that they change communication behaviour and social relationships in military units. Sociological research on these phenomena has mainly focussed on civilian fields of application. The results are not directly transferable to the use of weapon systems, but they are significant for the design of the human–machine interface. Communication problems arise again and again, even when using the technology for interpersonal communication, despite the corresponding technical language and fixed communication processes. The interactions of soldiers and robots in the field via speech and movement – and not just the operation of the machine – are therefore not only a technical challenge; they must also be taken into account in the training of soldiers.

Due to the technical complexity of robotic systems, armed forces will become even more dependent on private companies in the future. For example, for reasons of safety, the Heron drone leased by the Bundeswehr may only be launched and landed by the manufacturer’s personnel. The infrastructure for the data transmission of remote-controlled systems is also often privately owned, because a military satellite network with sufficient bandwidth would be too expensive. Although these specific cases would not apply to fully autonomous systems, from an operational-military point of view, at least, the option for remote control is necessary. Moreover, dependence on private-sector expertise will not diminish. On the contrary, the influence of civilian enterprises, especially civilian programmers, on military applications is growing. For example, the Federal Ministry of Defence and German defence companies are discussing the possibility of delegating certain tasks in the area of cyber security and defence to civilian companies, because well-trained experts in military high technology are in short supply.

The Influence of Technical Possibilities on Political Decisions

The use of weapon systems with autonomous functions raises the question of lowered thresholds for the use of force. The CCW debate shows that some of the technologically advanced states in this field, such as the United States, assess such effects in their armed forces. Even without knowing the results of their analyses, it can be said that States Parties in the CCW emphasise the necessity for human control in the use of LAWS. However, because the extent of human control required has not yet been sufficiently determined, the consequences for warfare itself and for the (political and military) threshold for the use of force remain largely unknown.

This is all the more problematic because there have been no published empirical studies on the possible lowering of the military threshold for the use of force. The “joystick mentality”, much cited in the 2000s and early 2010s, has since been displaced by a debate about post-traumatic stress disorders among pilots.17

LAWS could lead to a lowering of the military – and above all the political – threshold for the use of force.

From a political science perspective, however, there are indications that unmanned weapon systems increase the probability of armed conflicts, that is, they lower the political threshold for the use of force. The German political scientists Sauer and Schörnig argue, for example, on the basis of the theory of democratic peace,18 that unmanned military systems appear more attractive to democracies than other weapons, and can thus also lower the threshold for military deployment. The interest of democracies in UMSs lies in the fact that the political actors estimate the actual and political costs to be lower than with other weapon systems. According to Sauer and Schörnig, it is above all the lower number of losses of their own troops and less – or at least less visible – collateral damage that make (remote-controlled) military robots appear more attractive and could, in the long term, lower the political threshold for the use of force.19

In the German discourses on the justification for the use of armed drones, the protection of one’s own troops is the dominant argument, as opposed to new forms of deployment.20 However, both remote-controlled and autonomous systems can represent an even stronger dissociation from warfare than before – in addition to physical and emotional dissociation, now also intellectual dissociation. Remote-controlled systems have already allowed for operations that would not have taken place using manned systems. The physical removal of soldiers facilitated targeted killings by the Central Intelligence Agency as an essential component of the US strategy in the fight against terrorism. With troops on the ground, their own losses would have been much higher; a permanent presence of fighter jets on site would have meant an obvious violation of the sovereignty of third countries. Thus, the technology of armed drones prepares the ground for a strategy that weakens geographical and temporal limits in the fight against international terrorism, and thus extends the interpretation of an armed conflict. On the other hand, the number of visible victims decreases, which shifts conflicts below the publicly perceived or legal threshold, and thus makes it more difficult to control the military.21

The Security Policy Implications of LAWS

UMSs are not only difficult for international humanitarian law to grasp, they can also cause security problems and pose major challenges to efforts towards arms control. Due to the dual-use character described above, the proliferation potential is high, but overly strict trade restrictions could, in turn, hamper the development of useful and peaceful civilian technology.

From the point of view of security policy, concerns arise with regard to international stability: With the growing autonomy of unmanned systems, the speed of their actions during operations increases, whereas the predictability of their behaviour decreases, since it is based, for example, on learning algorithms. Misjudgements by the robotic weapon systems can therefore hardly be corrected, which might lead to an escalation of the conflict in crisis situations.22 In addition, high-tech armament drives spiralling arms build-ups – after all, for some states, technological superiority is at the core of their military doctrine.23 At the CCW expert meetings that have taken place so far, these aspects have hardly been discussed; human rights have also played a subordinate role in the talks. International regulation on the development and use of LAWS could nevertheless have mitigating effects on the risks described without explicitly addressing them. For example, certain forms of human control in the targeting cycle would limit machine speed in combat and could therefore mitigate escalation risks in military conflict.

The Regulation of LAWS: Status and Perspectives

The United Nations

The intergovernmental debate on robotics is about technological, international legal, ethical, and security aspects. The focus is on military applications, whereas the regulation of civilian robots is mainly discussed at the national level, and partly also at the European level. So far, two forums of the United Nations (UN) have been used for the debate on the military use of autonomous systems: the Human Rights Council and the Convention on Certain Conventional Weapons.

The Special Rapporteur of the UN Human Rights Council, Christof Heyns, addressed LAWS in his report of 2013 and clearly opposed their development and use.24

As a result of his concerns, the regulation of LAWS has been discussed since 2014, mainly within the CCW. Given the mandate of the CCW, this debate is strongly focussed on international humanitarian law. Although the participants in the meetings have also discussed other topics, such as ethics and international stability, these topics will play a subordinate role in any possible regulation of LAWS.

Such regulation could have different legal and political effects and could, for example, be adopted in the form of an additional protocol to the existing convention. Comparable additional protocols already exist for weapons with non-detectable fragments, landmines, incendiary weapons, blinding laser weapons, and explosive remnants of war. However, the aim of talks or negotiation processes that are started in the CCW and continued – in this or another forum – is often itself the subject of political debate, as is the choice of forum. For example, negotiations on cluster munitions and anti-personnel mines also began in the CCW, but in the absence of a consensus, the agreements were finally adopted outside the CCW by states that were prepared to do so. It is important to note that the existing CCW protocols do not generally prohibit the aforementioned types of weapons, but merely limit their use in order to ensure compliance with international humanitarian law (in particular for the protection of civilians). To date, the CCW has achieved only a single preventive ban, on the use of blinding laser weapons.

Since 2014, the CCW has held three informal meetings of experts on LAWS and three meetings of governmental experts with representatives of states, non-governmental organisations, and experts. The aim of these meetings was to show the state representatives the technical possibilities – with their advantages and disadvantages – and thus create the basis for an informed debate on a possible regulation of LAWS.

The necessity for regulating LAWS is controversial – but so is the subject of regulation.

In the CCW debate, three strands of discussion are central: firstly, the question of the general necessity for international regulation; secondly, the precise definition of LAWS as the starting point for regulation; and thirdly, further criteria for possible regulation.

With regard to the first strand of discussion, it can be stated that the benefits of regulation are already controversial. For example, a ban on the development of LAWS would be conceivable, but it could at the same time hinder the civilian development of autonomous systems. It therefore seems more practical to restrict the use of LAWS, even if the systems could then still be used in individual cases. A “weaker” solution would be national moratoriums on the development of LAWS or a joint political declaration on elements of regulation until comprehensive (international) regulation is developed.

Before formulating a definition of LAWS – the second focal point in the CCW debate – there is the fundamental question of whether a definition is necessary, or even possible. Many states are of the opinion that a working definition is sufficient for the time being.

In formulating the definition, the difficulties relate to all elements of the term “LAWS”: “lethal”, “autonomous”, and “weapon system”. The focus of the debate is on the definition of autonomy. If it is defined very broadly, existing systems could also be included and would possibly have to be prohibited. The majority of CCW members, including the Federal Republic of Germany, reject this. One possible solution would be to set a deadline: Unmanned systems that were developed and used before then would not be covered by the prohibition. It is unclear whether software updates of existing systems would be allowed, because they could relatively easily increase autonomy without necessarily giving more control. As with many considerations concerning the autonomous functions of machines, a verification problem would also arise here – if a binding regulation were to provide for a verification mechanism at all.

In order to circumvent this problem at least partially, the concept of “meaningful human control” has prevailed as a conceivable criterion of regulation in the course of the CCW meetings. This represents the third strand of the debate within the CCW. The concept means that the operator is sufficiently informed about the context of use and can realistically assess and, if necessary, change the actions of the machine and the consequences of the use of weapons during the process of selecting and engaging targets.25 This would also ensure that the necessary humanitarian considerations under international law are carried out by human beings and not delegated to the machine, or even neglected.
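These criteria can be paraphrased as a simple checklist; the decomposition into three conditions below is our illustration, not an agreed CCW definition.

```python
# Paraphrase of the "meaningful human control" criteria as a checklist.
# The three-way decomposition is illustrative, not an agreed definition.
from dataclasses import dataclass

@dataclass
class OperatorSituation:
    informed_about_context: bool     # sufficiently informed about the use context
    can_assess_machine_action: bool  # can realistically assess actions/consequences
    can_intervene: bool              # can change or abort during target
                                     # selection and engagement

def meaningful_human_control(s: OperatorSituation) -> bool:
    """All three conditions must hold; any single gap breaks control."""
    return (s.informed_about_context
            and s.can_assess_machine_action
            and s.can_intervene)

# An operator who could technically abort but cannot assess what the
# machine is doing does not exercise meaningful control:
print(meaningful_human_control(OperatorSituation(True, False, True)))  # False
```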

Some experts in the CCW process have argued that the regulation of LAWS should focus on individual functions of the machine rather than include a comprehensive definition of autonomy.26 This refers to functions that are necessary for the selection of targets and the use of weapons and are therefore considered particularly problematic for compliance with international humanitarian law (“critical functions”). In such decisions, a person must always have control in order to make the necessary judgements under international law regarding the appropriateness of the military means used and the distinction between the military and the civilian population. It is questionable, however, whether other factors are not also relevant for exerting significant human control.27 A consideration beyond international humanitarian law in particular suggests that other characteristics of weapons platforms, such as range, speed, operation time, and armament type, should be taken into account. The analysis of these criteria could, among other things, illustrate the risk of arms dynamics or the escalation of a conflict and serve as a benchmark. The design of the human–machine interface and the type of automated data evaluation are also relevant for these two phenomena (arms dynamics, conflict escalation), but their operationalisation is difficult.28

Although the CCW process focusses on international humanitarian law, that is, the legal dimension of LAWS, the ethical dimension also plays a significant role in the debate. On the one hand, written and customary law are often the result of ethical ideas; on the other hand, there is a reference to the Martens Clause in the preamble of the CCW. It states that custom, public conscience, and the principles of humanity can serve as sources for international humanitarian law if no other regulation applies.

Some opinion polls try to approach the public conscience empirically; all, however, show inherent methodological weaknesses. The surveys are generally not representative of the entire world population, and the questions, whether intentionally or not, are often formulated suggestively. In addition, surveys alone do not sufficiently capture the public conscience, which is also fuelled, for example, by media debates and public controversies on the topic.29

Autonomy in weapon systems would violate the human dignity of the victims.

However, the central question of the ethical debate is the violation of human dignity by autonomous weapon systems. Dignity is an integral part of humanity. It presupposes that a human being is never made an object or a mere means to an end. If the decision to kill is made in war, it is therefore important that a morally acting person understands and reflects that he or she is taking the life of another person. A machine cannot act morally because it lacks the understanding of mortality and the value of life.30 The use of autonomous weapon systems would thus violate the dignity of the victims, whether members of the armed forces or the civilian population – even technical improvements and technological progress cannot solve this problem.

As far as the prospects of the CCW process are concerned, it could produce different results – not all of them equally likely. A new protocol to the CCW with a legally binding ban on the development and use of LAWS would be the most comprehensive solution – but it is unlikely in view of the progress of the talks. So far, 28 states have spoken out in favour of such a ban,31 but many are sceptical of, or explicitly opposed to, any form of regulation in the CCW. The line of conflict can be clearly identified by the criterion of whether a state has the capabilities and an expressed interest in developing and deploying LAWS. The United States, South Korea, Israel, and also Russia are against a ban, whereas many developing and emerging countries are in favour of it.

The consensus principle within the CCW will therefore most likely lead to a compromise solution. This could take the form of a political declaration, as proposed by Germany and France in 2017.32 It could lay down essential principles, such as that of human control, and formulate in more detail how states should implement them. The “possible guiding principles”33 adopted by the CCW States Parties in August 2018 do not exclude such a next step towards a political declaration – on the contrary: They represent a first cautious and non-binding attempt at an agreement.

The diplomats are negotiating under time pressure because, on the one hand, technical development is progressing, and the United States and Australia, for example, are investing significant financial resources in the development of weapon systems with autonomous functions. On the other hand, if no agreement is reached within the next one to two years, the negotiation process could be transferred to another forum (outside the UN). This was already the case with the agreements on anti-personnel mines (entry into force in 1999) and cluster munitions (entry into force in 2010). Although a prohibition treaty outside the UN could also have a normative effect, it would initially have no legal effect on states that are not party to it.

The European Union

The topic of autonomous weapon systems is also being discussed at the European level. In September 2018, the European Parliament passed a resolution34 in which it demanded, by a large majority, a ban on weapons that are not subject to human control during the use of force. The EP is also calling on the European Council to formulate a corresponding Common Position of the EU member states on the CCW process. However, this resolution has no legally binding effect, and the member states must decide for themselves about a Common Position. The very different attitudes of individual member states make this more difficult: The United Kingdom opposes regulation, whereas Germany and France propose a middle course, and Austria demands a comprehensive ban.

In the EP’s debate on the draft resolution, the High Representative of the Union for Foreign Affairs and Security Policy, Federica Mogherini, confirmed the need for common principles for the use of LAWS.35 In particular, operations must be carried out in accordance with the rules of international humanitarian law, and decisions on the use of lethal force should always be taken by human beings and not by machines. In her speech, Mogherini referred to an expert group on technology issues (Global Tech Panel), which she set up in spring 2018. The group is intended to provide answers to questions at the intersection of technology and security policy. However, the composition of this expert group – mainly representatives from the private sector – suggests that LAWS will not play a significant role.36 In an open letter to the High Representative in October 2018, several parliamentary groups in the EP therefore criticised the lack of independent LAWS experts from science and civil society.37 The need to regulate LAWS was also underlined by the majority of experts invited to a public hearing in October 2018 by the EP’s Subcommittee on Security and Defence (SEDE) on the role of AI in defence.38

The disagreement of the EU member states regarding the development and use of LAWS was publicly demonstrated for the first time in the design of the European Defence Fund. At the request of the EP, it was to contain an exclusion list of technology areas not eligible for common funding, including – in the eyes of some parliamentary groups – autonomous weapon systems. In its first version, therefore, it explicitly excluded the promotion of such technologies. The follow-up version of November 2018, which takes into account the position of the European Council, only contains a reference to the necessity that funded research and development must under no circumstances lead to weapon systems that violate existing international law. Meanwhile (March 2019), the final round of inter-institutional negotiations on the regulation establishing the European Defence Fund between the EP, the Council, and the European Commission has led to agreed language that excludes “[a]ctions for the development of lethal autonomous weapons without the possibility for meaningful human control over the selection and engagement decisions when carrying out strikes against humans”.39 If the European Parliament and the Council endorse the agreed text, the European Union will have created a legal instrument that defines LAWS and characterises them as a technology that is not eligible for funding, which corresponds to the EP’s call for banning such technology.

Bundestag and Federal Government

In the German debate on robotics – be it for military or civilian applications – the German government and parliament are taking their first steps. One point of reference is the coalition agreement of 2013, in which the coalition partners express their intention “to work for an international ban on fully automated weapon systems that exclude humans from the decision on the use of force”,40 but also to regulate unmanned systems below this threshold internationally. The coalition agreement between the CDU/CSU and the SPD of 2018 provides for a similar approach, although it uses the more common term “autonomous weapon systems”.41 Foreign Minister Heiko Maas has taken up this international ban several times, but he has made it clear that Germany is pursuing a step-by-step approach through the above-mentioned political declaration, with the long-term goal of a binding ban.42

By the end of 2018, the plenum of the German Bundestag had not yet dealt with the LAWS issue; the Subcommittee on Disarmament and Arms Control had, most recently in 2015. In November 2018, however, the plenum dealt with the EU Defence Fund and the aforementioned technology exclusion list. An amendment tabled by the Bündnis 90/Die Grünen parliamentary group, to the effect that the German government should make every effort to put LAWS back on the exclusion list, was referred to the committees and eventually rejected in January 2019.

In the Bundeswehr, a fundamental strategic debate on the pros and cons of weapon systems with autonomous functions is also pending. While the use of armed, remotely controlled drones meets with approval within the Bundeswehr, the view of autonomous systems is a different one: The soldier’s loss of control tends to be viewed negatively. Added to this is a lack of confidence in the cognitive and communicative abilities of future “combat robots”: They would not meet the requirements of the Bundeswehr and would therefore diminish the benefits of this technology.43 The Federal Ministry of Defence does not mention military robotics in the 2016 Defence White Paper. In addition, there is no German (working) definition that specifies LAWS more precisely and concretises Germany’s position in the international negotiations within the CCW.

The civilian use of robots and AI is attracting greater attention and is particularly prominent in the Federal Government’s AI strategy44 of November 2018. The civilian use of robotics is diverse and includes (now or in the near future) industrial robots, home care, autopilots, camera platforms, and delivery services. Especially in connection with the IoT, that is, the networking of objects with people and among each other, many opportunities and challenges arise. In Germany, two developments are being discussed in particular: the use and regulation of small drones for different purposes,45 and autonomous driving.

In Germany, the automotive industry in particular is a driver of (civilian) developments in robotics: Autonomous driving has made great progress in recent years.46 However, the legal requirements in many countries, including Germany, are still lagging behind, and most time forecasts have turned out to be unrealistic.47

To assess LAWS, it is necessary, but not sufficient, to consider civilian developments.

It is clear, however, that the debate on autonomous driving shapes the public understanding of autonomy and of the use of robots.48 In addition, the civilian sector anticipates possible military developments and identifies problems. These include the design of the human–machine interface and human control. To this end, the expectations towards autonomous systems or assistance systems – and which of these can realistically be fulfilled – have to be clarified.

In the debate about civilian applications, ethical questions are assuming more importance. These go beyond the acquisition of data for learning systems and the associated data protection requirements, addressing crucial issues such as human dignity, which can be violated by machine “decisions”.49 Important actors in the German debate on the ethics of robots in general – and of LAWS in particular – are the Catholic and Protestant churches. They have frequently organised conferences and discussions on this aspect of robotics, as well as presented publications. Overall, however, ethical aspects occupy only a superficial place in the public debate on military robotics, and the discussion is rarely well-founded.

In the field of civilian applications, a change towards a more in-depth examination of ethical questions can be seen: In September 2016, for example, the Federal Ministry of Transport and Digital Infrastructure appointed an Ethics Commission on Automated Driving. It consisted of 14 experts from various fields and published guidelines for the programming and use of autonomous vehicles in June 2017.50 The guidelines deal, among other things, with liability issues and the weighing up of damage in the event of imminent accidents – a well-known dilemma that is given new relevance with the transfer of decisions to machines. The Commission’s final report also mentions the so-called trolley problem, a thought experiment in which a person (or a machine) has to weigh up human lives – but it offers no solutions to this problem. It is obvious that, in such cases, a human should make the decision. Whether these guidelines could be applied to autonomous weapon systems remains open, since the Commission does not envisage transferring them to the military use of autonomous systems.51 However, some of the conclusions also relate to problems of military use, such as the question of human responsibility in the use of certain autonomous functions. Neither the Federal Government nor the Bundestag has set up an expert committee to discuss in detail the ethical questions concerning the military use of autonomous systems or the general use of AI in all areas of society. To date, civilian and military applications have generally been considered separately.

Conclusions and Recommendations

Towards a National Strategy for the Regulation of LAWS

The political, legal, and ethical questions raised by the development and use of LAWS are urgent and of great importance for the shaping of German security and defence policy. The answers to these questions will be shaped today, as well as in the future, by the public debate on the civilian applications of robots. If the talks and a possible negotiation process within the framework of the CCW progress, the existing political position in the current coalition agreement will continue to point the way, but its content will no longer be sufficient. In order to be able to continue to actively shape the multilateral international process, the following is necessary: The entire Federal Government, in particular the Federal Foreign Office and the Federal Ministry of Defence (MoD), must deal intensively and jointly with the issue of LAWS. A resulting document should fulfil three tasks:

  • First, it should name and answer the questions concerning definitions. The German MoD – taking into account the tradition of ethics in the armed forces – appears to be a crucial actor in discussing the impact of technology on the definition of LAWS and vice versa. As a potential user of such weapon systems, the MoD must develop its own definition of these systems – as the US Department of Defense has done, for example, in Directive 3000.09 – and thus engage in the political discussion.52

  • Second, this document should set the political and legal framework for the use of autonomous functions in weapon systems of the German Armed Forces.

  • Third, this would align and limit existing research on military autonomy in a way that would respect existing principles of international law and take into account the emerging norm of human control.

The draft of such a strategy paper of the Federal Government on weapon systems with autonomous functions could form the basis for a parliamentary debate, and the principles of international law included in this paper could be further legitimised by a resolution. On the way there, however, some hurdles still have to be overcome:

  • First, parts of the political debate still lack reliable knowledge about the underlying technology of robotics. Where scientific-technological know-how is available, translation work from the technical-academic to the political-discursive sphere is still necessary. Here it could help to strengthen existing structures at the interface of science, politics, and business or, where necessary, to create new ones. The (military) use of AI in general – and of weapon systems with autonomous functions in particular – will remain a political challenge for a long time, and not only in terms of regulation.

  • Second, owing to the interdisciplinary nature of the issue, there is often a lack of tools to describe the specific functionalities of the technology. The language used to characterise robots is often ambiguous, overly simplistic, anthropomorphic, and judgemental. It thereby perpetuates the idea that the systems in question possess human characteristics. Terms such as “decide”, “evaluate”, and “select” describe the purpose humans intend for the machines, but not their actual functionality – and certainly not their capabilities. It is therefore advisable to find a language that adequately describes this technology and to establish it in the political discourse. The International Panel on the Regulation of Autonomous Weapons (iPRAW), for example, proposes replacing the term “artificial intelligence” with a concrete description of the algorithmic processes used. Although the term “machine learning” has prevailed in the meantime, it is advisable to speak of “training” and “data-driven algorithms” – or at least to always keep this purely technical meaning in mind (see the illustrative sketch following this list).

  • Third, it remains inevitable that developers, military users, and ultimately political decision-makers will have to examine intensively the nature and scope of the autonomous functions of weapon systems, from ethical as well as international law perspectives. The creeping progression towards ever more autonomous functions – via assistance systems – requires reflection at the political level, including in the German Bundestag. A public hearing of the Defence Committee could set important priorities and initiate a debate that would also highlight the technical background of these developments and the resulting military consequences. A thorough analysis of the respective human–machine interface is particularly important: It is the only way to ensure that the transfer of decision-making and responsibility to the machine proceeds as desired and that human control in the targeting cycle is maintained.
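To make the terminological point concrete, the following minimal sketch illustrates – in plain Python, with purely illustrative data not drawn from this study or any real system – what “training” a “data-driven algorithm” amounts to in the simplest case: the machine does not “decide” in any human sense; its output is fully determined by human-supplied training examples and a distance metric.

```python
# Minimal, hypothetical sketch of a trained, data-driven algorithm:
# a one-nearest-neighbour classifier. All data and labels are illustrative.
from math import dist

# "Training" here simply means storing human-labelled feature vectors.
TRAINING_DATA = [
    ((0.9, 0.1), "vehicle"),
    ((0.8, 0.2), "vehicle"),
    ((0.1, 0.9), "building"),
    ((0.2, 0.8), "building"),
]

def classify(features):
    """Return the label of the training example closest to `features`.

    No evaluation or judgement is involved: the result is a pure
    function of the stored data and the Euclidean distance metric.
    """
    _, label = min(TRAINING_DATA, key=lambda example: dist(example[0], features))
    return label

print(classify((0.85, 0.15)))  # -> "vehicle"
```

Calling the output of such a procedure a “decision” or a “selection” describes the purpose humans assign to it, not the computation itself – which is precisely the ambiguity that a more technical vocabulary would avoid.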

With the Franco-German working paper of autumn 2017, the German government positioned itself more clearly in favour of a step-by-step procedure for achieving international regulation of LAWS.53 Time is running short, however: technological development increases the pressure to enact regulation, while at the same time many states want to exploit the military possibilities of those very developments. This is precisely why regulation is necessary – a point that has meanwhile been recognised in the German political debate.

In light of its self-declared aims and the normative basis of German foreign policy, particularly in the field of arms control, it makes sense for the Federal Government to continue and intensify its efforts to reach internationally binding rules. A global ban on LAWS, understood as systems without human control, is an ethical and legal imperative in view of existing international humanitarian law. Respect for human dignity within the meaning of Article 1 of the Basic Law can only be ensured in the use of weapon systems with autonomous functions through precisely this course of action, that is, by maintaining human control.

For Germany, more is at stake than the military advantages that come with the use of such systems – such as the protection of its own soldiers, which is undisputedly a great asset. What is at stake is control over conflict itself, which humanity could, at least in part, lose if military conflicts are fought by machines in the future.

The EU as an Important Actor in the Regulation of LAWS

France and Germany represent two diverging national positions on the question of regulating LAWS. However, the 2017 Franco-German working paper shows that there is a common basis for regulation: ensuring human control.

Even though the two partners disagree on the goal of an international negotiation process – for Berlin, a political declaration represents only a first step, whereas Paris has not yet shown itself open to legally binding instruments – this common ground could serve as the focal point for an EU Common Position. That position must be expanded, sharpened, and then deployed as a workable compromise within the framework of the CCW.

The EU can take accompanying measures to maintain the credibility of a value-oriented European foreign policy. The Federal Government could take action in Brussels on two issues in particular:

  • First, the EU should not fund research – for example through the European Defence Fund – that contributes directly or indirectly to the development of LAWS. This makes it all the more important to promote research aimed at ensuring human control over autonomous weapon systems while preserving the potential military benefits of such systems.

  • Second, the advancing technological developments in the civilian as well as the military sector must be critically analysed and politically accompanied. In its first resolution on (civil) robotics of February 2017, the EP calls for a European agency to research the effects of this technology.54 The mandate of such an agency should include interdisciplinary, critical research into the effects of potential military use. With its market power and political influence, the EU could then feed the resulting norms into international standardisation processes and, ultimately, into the implementation of international regulation.

The Transformation of CCW Talks into a Negotiation Process

The discussion process within the CCW is increasingly slowing down and is at risk of failing due to resistance from individual states – the same states that continue to push technical development forward. In order to take account of the new challenges related to LAWS and to mitigate their negative implications, a timely compromise is necessary. The focus should be on human control over the use of force, in order to anchor it internationally as a norm. A politically binding declaration could help, but it poses certain challenges. For example, it would leave many important decisions at the national level for the time being, even though they have a global impact. In addition, it carries the risk of halting further negotiation processes. If the CCW States Parties agree on this option, further political pressure will be needed to strengthen and shape the principle of “human control over the use of force” internationally.

Abbreviations

AI – Artificial Intelligence

CCW – Convention on Certain Conventional Weapons

EP – European Parliament

EU – European Union

IoT – Internet of Things

iPRAW – International Panel on the Regulation of Autonomous Weapons

LAWS – Lethal Autonomous Weapon Systems

MoD – Federal Ministry of Defence (Bundesministerium der Verteidigung)

SEDE – European Parliament’s Subcommittee on Security and Defence

UMS – Unmanned Military System

UN – United Nations

Endnotes

1 See European Commission, Communication from the Commission to the European Parliament, the European Council, the Council, the European Economic and Social Committee and the Committee of the Regions: Artificial Intelligence for Europe {SWD(2018) 137 final}, 25 April 2018, https://ec.europa.eu/transparency/regdoc/rep/1/2018/EN/COM-2018-237-F1-EN-MAIN-PART-1.PDF (accessed 22 February 2019); Deutsche Bundesregierung, Strategie Künstliche Intelligenz der Bundesregierung (Berlin, November 2018), https://www.bmbf.de/files/Nationale_KI-Strategie.pdf (accessed 7 December 2018).

2 See Chambre des Représentants de Belgique, DOC 54 3203/001, 27 June 2018, http://www.dekamer.be/FLWB/PDF/54/3203/54K3203001.pdf (accessed 7 December 2018).

3 The Civil Clause refers to passages in the statutes of German universities or higher education institutions that restrict their research to purely civilian applications. Among other things, questions of dual-use applications are controversial, that is, whether technology development should be carried out even if military use is already apparent.

4 Machine learning describes a series of mathematical-statistical procedures in which algorithms search for similarities or patterns in large amounts of data. For example, they can be used to classify objects, but also to find new rules. A good overview is provided by Ben Buchanan and Taylor Miller, Machine Learning for Policymakers. What It Is and Why It Matters (Cambridge, MA: Belfer Center for Science and International Affairs, Harvard Kennedy School, June 2017), https://www.belfercenter.org/sites/default/files/files/publication/MachineLearningforPolicymakers.pdf (accessed 14 January 2019).

5 The term “Internet of Things” describes the networking of technical devices, such as household appliances, via data connections with the Internet. Manufacturers promise users better usability and synergy effects in interaction with other devices. At the same time, such IoT devices can collect and transmit a large amount of data. Since they are always online, there is also a risk that such devices could be attacked electronically via the Internet.

6 The Pentagon’s Maven project, in collaboration with Google, attracted particular attention with regard to the evaluation of image data. Under pressure from Google staff and public reporting, Google will not extend this collaboration with the Pentagon beyond 2019. See Daisuke Wakabayashi and Scott Shane, “Google Will Not Renew Pentagon Contract That Upset Employees”, The New York Times, 1 June 2018, https://www.nytimes.com/2018/06/01/technology/google-pentagon-project-maven.html (accessed 7 December 2018).

7 An example of this is the British Taranis (BAE) demonstrator project. The following interview with the Chief Engineer for Armed UAVs at BAE Systems provides an insight into the project: Beth Stevenson, “ANALYSIS: Taranis Developers Reveal Test Flight Specifics”, Flight Global, 16 May 2016, https://www.flightglobal.com/news/articles/analysis-taranis-developers-reveal-test-flight-spec-425347/ (accessed 7 December 2018).

8 The following article gives a brief overview of the development of a remote-controlled, armed Russian tank: Florian Rötzer, “Russischer Kampfroboterpanzer soll bald von Armee eingesetzt werden”, Telepolis, 30 March 2016, https://www.heise.de/tp/features/Russischer-Kampfroboterpanzer-soll-bald-von-Armee-eingesetzt-werden-3379287.html (accessed 7 December 2018).

9 For an explanation of “deep neural networks” and “artificial intelligence”, see Marcel Dickow and Daniel Jacob, The Global Debate on the Future of Artificial Intelligence. The Need for International Regulation and Opportunities for German Foreign Policy, SWP Comment 23/2018 (Berlin: Stiftung Wissenschaft und Politik, May 2018), https://www.swp-berlin.org/en/publication/the-future-of-artificial-intelligence/ (accessed 22 February 2019).

10 See Tom B. Brown, Dandelion Mané, Aurko Roy, Martín Abadi and Justin Gilmer, “Adversarial Patch”, 31st Conference on Neural Information Processing Systems (NIPS 2017), 17 May 2018, https://arxiv.org/pdf/1712.09665.pdf (accessed 7 October 2018).

11 For a more detailed discussion of the process of target selection in connection with autonomous functions in weapon systems, see International Panel on the Regulation of Autonomous Weapons (iPRAW), Focus on Technology and Application of Autonomous Weapons (August 2017), https://www.ipraw.org/wp-content/uploads/2017/08/2017-08-17_iPRAW_Focus-On-Report-1.pdf (accessed 18 January 2019); Merel Ekelhof, Autonomous Weapons: Operationalizing Meaningful Human Control (15 August 2018), https://blogs.icrc.org/law-and-policy/2018/08/15/autonomous-weapons-operationalizing-meaningful-human-control/ (accessed 18 January 2019).

12 This position is particularly clearly represented by the United States, see United States of America, CCW/GGE.1/2017/WP.6, Autonomy in Weapon Systems, 10 November 2017, 3, https://www.unog.ch/80256EDD006B8954/(httpAssets)/99487114803FA99EC12581D40065E90A/$file/2017_GGEonLAWS_WP6_USA.pdf (accessed 11 December 2017).

13 See Ronald Arkin, Governing Lethal Behavior in Autonomous Robots (Boca Raton, 2009).

14 See United Nations Institute for Disarmament Research, The Weaponization of Increasingly Autonomous Technologies: Autonomous Weapon Systems and Cyber Operations (Geneva, 2017), http://unidir.org/files/publications/pdfs/autonomous-weapon-systems-and-cyber-operations-en-690.pdf (accessed 7 December 2018).

15 See Anh Nguyen, Jason Yosinski and Jeff Clune, “Deep Neural Networks Are Easily Fooled: High Confidence Predictions for Unrecognizable Images”, Computer Vision and Pattern Recognition, 2015, http://www.evolvingai.org/files/DNNsEasilyFooled_cvpr15.pdf (accessed 7 December 2018).

16 See Peter W. Singer, Wired for War: The Robotics Revolution and Conflict in the Twenty-First Century (London, 2009).

17 With the advent of the first armed drones remotely controlled via satellite, the suspicion arose that the long distance between the pilot and the scene, as well as the computer-based user interface of the drones, could lead to an uninhibited use of weapons. This so-called joystick mentality has not yet been proven. In fact, however, the proximity to the potential human target created by the long observation period, as well as the regular alternation between duty and leisure time, that is, between military and domestic environments, seem to increase the stress level of the crew and increasingly lead to post-traumatic stress disorders.

18 The theory of democratic peace, which goes back to Immanuel Kant’s Perpetual Peace of 1795, originally assumed that democratic states would not wage wars. Today it appears more likely that democracies wage fewer wars against each other than against states that are not democratically organised.

19 See Frank Sauer and Niklas Schörnig, “Killer Drones: The ‘Silver Bullet’ of Democratic Warfare?”, Security Dialogue 43, no. 4 (August 2012): 363–80.

20 See Deutscher Bundestag, Beschaffung von Kampfdrohnen umstritten (30 June 2014), https://www.bundestag.de/dokumente/textarchiv/2014/kw27_pa_verteidigung/283434 (accessed 7 December 2018).

21 So far, political science, for example, has often assumed a definition of war based on the number of deaths (more than 1,000). If wars with major losses become less frequent due to the use of UMS, the definition might have to be changed accordingly.

22 See Jürgen Altmann and Frank Sauer, “Autonomous Weapon Systems and Strategic Stability”, Survival 59, no. 5 (2017): 117–42.

23 See Jean-Marc Rickli, Some Considerations of the Impact of LAWS on International Security: Strategic Stability, Non-State Actors and Future Prospects (16 April 2015), http://www.unog.ch/80256EDD006B8954/%28httpAssets%29/B6E6B974512402BEC1257E2E0036AAF1/$file/2015_LAWS_MX_Rickli_Corr.pdf (accessed 17 January 2019).

24 See United Nations General Assembly, Report of the Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions, Christof Heyns, A/HRC/23/47 (9 April 2013), http://www.ohchr.org/Documents/HRBodies/HRCouncil/RegularSession/Session23/A-HRC-23-47_en.pdf (accessed 7 December 2018).

25 See Heather Roff and Richard Moyes, Key Elements of Meaningful Human Control, Article 36, April 2016, http://www.article36.org/wp-content/uploads/2016/04/MHC-2016-FINAL.pdf (accessed 14 January 2019).

26 See Chris Jenks, The Confusion and Distraction of Full Autonomy – Presentation at the CCW [Informal] Expert Meeting on LAWS, April 2016, http://www.unog.ch/80256EDD006B8954/%28httpAssets%29/7197832D3E3E935AC1257F9B004E2BD0/$file/Jenks+CCW+Remarks+Final.pdf (accessed 7 December 2018); also International Committee of the Red Cross, Views of the International Committee of the Red Cross (ICRC) on Autonomous Weapon Systems (11 April 2016), https://www.icrc.org/en/document/views-icrc-autonomous-weapon-system (accessed 7 December 2018).

27 On technical and operational factors (and others), see International Panel on the Regulation of Autonomous Weapons (iPRAW), Focus on the Human–Machine Relation in LAWS (March 2018), 9–13, https://www.ipraw.org/wp-content/uploads/2018/03/2018-03-29_iPRAW_Focus-On-Report-3.pdf (accessed 7 December 2018).

28 See Marcel Dickow et al., First Steps towards a Multidimensional Autonomy Risk Assessment (MARA) in Weapons Systems, SWP Working Paper (Berlin: Stiftung Wissenschaft und Politik, December 2015), http://www.swp-berlin.org/fileadmin/contents/products/arbeitspapiere/FG03_WP05_2015_MARA.pdf (accessed 7 December 2018); Anja Dahlmann, “Getting a Grasp of LAWS? What Quantitative Indicator-Based Approaches Could Bring to the Debate”, in Lethal Autonomous Weapons Systems – Technology, Definition, Ethics, Law and Security, ed. German Federal Foreign Office (Berlin, 2017), 36–43.

29 For an overview of the topic, see Human Rights Watch, Heed the Call (21 August 2018), https://www.hrw.org/report/2018/08/21/heed-call/moral-and-legal-imperative-ban-killer-robots (accessed 5 December 2018).

30 See iPRAW, Focus on Ethical Implications for a Regulation of LAWS (August 2018), 12, https://www.ipraw.org/wp-content/uploads/2018/08/2018-08-17_iPRAW_Focus-On-Report-4.pdf (accessed 5 December 2018).

31 See Campaign to Stop Killer Robots, Country Views on Killer Robots (22 November 2018), https://www.stopkillerrobots.org/wp-content/uploads/2018/11/KRC_CountryViews22Nov2018.pdf (accessed 7 December 2018).

32 A draft for this political declaration is not yet available (as of early March 2019); Germany and France have submitted the proposal in a joint working paper and several statements during the CCW talks: France and Germany, For Consideration by the Group of Governmental Experts on Lethal Autonomous Weapons Systems (LAWS) – CCW/GGE.1/2017/WP.4 (7 November 2017), http://www.reachingcriticalwill.org/images/documents/Disarmament-fora/ccw/2017/gge/documents/WP4.pdf (accessed 15 January 2019). By the end of 2018, however, the United States, for example, had explicitly spoken out against a political declaration of any kind in the CCW, which would be politically binding for it, as for most signatory states.

33 See 2018 Group of Governmental Experts on Lethal Autonomous Weapons Systems (LAWS), Report of the 2018 Session of the Group of Governmental Experts on Emerging Technologies in the Area of Lethal Autonomous Weapons Systems – CCW/GGE.1/2018/3 (23 October 2018), https://www.unog.ch/80256EDD006B8954/(httpAssets)/20092911F6495FA7C125830E003F9A5B/$file/CCW_GGE.1_2018_3_final.pdf (accessed 14 January 2019).

34 See European Parliament, European Parliament Resolution of 12 September 2018 on Autonomous Weapon Systems (2018/2752(RSP)) (12 September 2018), points 2 and 4, http://www.europarl.europa.eu/sides/getDoc.do?pubRef=-//EP//TEXT+TA+P8-TA-2018-0341+0+DOC+XML+V0//EN (accessed 22 February 2019).

35 See European External Action Service, Autonomous Weapons Must Remain under Human Control, Mogherini Says at European Parliament (14 September 2018), https://eeas.europa.eu/headquarters/headquarters-homepage/50465/node/50465_de (accessed 7 December 2018).

36 See European External Action Service, About the Global Tech Panel (21 September 2018), https://eeas.europa.eu/headquarters/headquarters-homepage/50886/about-global-tech-panel_de (accessed 7 December 2018).

37 See Reinhard Bütikofer et al., Letter to Federica Mogherini (17 October 2018), https://reinhardbuetikofer.eu/wp-content/uploads/2018/11/Letter-to-HR-VP-on-autonomous-weapons-and-civil-society-17_10_2018.pdf (accessed 7 December 2018).

38 Further information on the opinions of the experts invited to the hearing at the SEDE on 10 October 2018 is available from the European Parliament: http://www.europarl.europa.eu/committees/en/sede/publications.html (accessed 7 December 2018).

39 Council of the European Union, Proposal for a Regulation of the European Parliament and of the Council Establishing the European Defence Fund (First Reading) – Progress Report (6733/1/19 REV 1) (1 March 2019), 7, https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CONSIL:ST_6733_2019_REV_1&from=EN (accessed 7 March 2019).

40 CDU/CSU/SPD, Deutschlands Zukunft gestalten. Koalitionsvertrag zwischen CDU, CSU und SPD, 18. Legislaturperiode (2013), 124, https://www.cdu.de/sites/default/files/media/dokumente/koalitionsvertrag.pdf (accessed 15 January 2019).

41 CDU/CSU/SPD, Ein neuer Aufbruch für Europa. Eine neue Dynamik für Deutschland. Ein neuer Zusammenhalt für unser Land. Koalitionsvertrag zwischen CDU, CSU und SPD, 19. Legislaturperiode (2018), 149, https://www.cdu.de/system/tdf/media/dokumente/koalitionsvertrag_2018.pdf (accessed 15 January 2019).

42 See Auswärtiges Amt/Heiko Maas, “Die Zukunft der nuklearen Ordnung – Herausforderungen für die Diplomatie” (27 June 2018), https://www.auswaertiges-amt.de/de/newsroom/maas-fes-tiergarten-konferenz/2112704 (accessed 18 January 2019); Auswärtiges Amt/Heiko Maas, “Wir müssen über Abrüstung reden”, 3 November 2018, https://www.auswaertiges-amt.de/de/newsroom/maas-spiegel-online-inf/2157268 (accessed 18 January 2019). In addition, the Federal Foreign Office financially supports the project The International Panel on the Regulation of Autonomous Weapons at the Stiftung Wissenschaft und Politik, Berlin. iPRAW is an interdisciplinary group of international academics and contributes to the CCW process with various reports on LAWS (https://www.iPRAW.org).

43 See Jörg Wellbrink, “Mein neuer Kamerad – Hauptgefreiter Roboter?”, Ethik und Militär, no. 1 (2014): 52–55.

44 See note 1.

45 See Bundesministerium für Verkehr und digitale Infrastruktur, Klare Regeln für Betrieb von Drohnen (2017), https://www.bmvi.de/SharedDocs/DE/Artikel/LF/151108-drohnen.html (accessed 7 December 2018).

46 See Stefan Krempl, “‘Hochautomatisiertes’ Fahren bis 2020 realisierbar”, heise online, 21 November 2015, http://www.heise.de/newsticker/meldung/Hochautomatisiertes-Fahren-bis-2020-realisierbar-3009915.html (accessed 7 December 2018).

47 See Fred Lambert, “Elon Musk Updates Timeline for a Self-driving Car, But How Does Tesla Play into It?”, electrek, 8 December 2017, https://electrek.co/2017/12/08/elon-musk-tesla-self-driving-timeline/ (accessed 15 January 2019).

48 See, e.g., the study commissioned by the Federal Ministry of Transport and Digital Infrastructure in autumn 2015. Although it does not deal with social consequences, they are addressed in the media reaction. See among others Matthias Breitinger, “Der Nutzer wird’s schon annehmen”, Die Zeit (online), 21 September 2015, http://www.zeit.de/mobilitaet/2015-09/autonomes-fahren-vernetzung-projekt/komplettansicht (accessed 7 December 2018).

49 For further information on the need to regulate artificial intelligence, see Dickow and Jacob, The Global Debate on the Future of Artificial Intelligence (see note 9).

50 See Udo Di Fabio et al., Bericht der Ethik-Kommission Automatisiertes und Vernetztes Fahren (June 2017), https://www.bmvi.de/SharedDocs/DE/Publikationen/DG/bericht-der-ethik-kommission.pdf?__blob=publicationFile (accessed 15 January 2019).

51 On the technology assessment regarding UMS and LAWS, see Office for Technology Assessment at the German Bundestag (TAB), Status Quo and Perspectives of the Military Use of Unmanned Platforms (May 2011), https://www.tab-beim-bundestag.de/en/research/u139.html (accessed 7 December 2018); TAB, Autonomous Weapon Systems (2017), https://www.tab-beim-bundestag.de/en/research/u30600.html (accessed 7 December 2018).

52 See Department of Defense, Directive Number 3000.09: Autonomy in Weapon Systems (21 November 2012; incorporating Change 1, 8 May 2017), http://www.esd.whs.mil/Portals/54/Documents/DD/issuances/dodd/300009p.pdf (accessed 7 December 2018); for further ideas on this recommendation, see Daniele Amoroso et al., Autonomy in Weapon Systems. The Military Application of Artificial Intelligence as Litmus Test for Germany’s New Foreign and Security Policy (Berlin: Heinrich Böll Foundation, 2018), 48–49, https://www.boell.de/sites/default/files/boell_autonomy-in-weapon-systems_v04_kommentierbar_1.pdf?dimension1=division_oen.

53 See note 32.

54 See European Parliament, European Parliament Resolution of 16 February 2017 with Recommendations to the Commission on Civil Law Rules on Robotics (2015/2103(INL)) (16 February 2017), point 16, http://www.europarl.europa.eu/sides/getDoc.do?pubRef=-//EP//TEXT+TA+P8-TA-2017-0051+0+DOC+XML+V0//EN (accessed 22 February 2019).

All rights reserved.

© Stiftung Wissenschaft und Politik, 2019

SWP Research Papers are peer reviewed by senior researchers and the executive board of the Institute. They are also subject to fact-checking and copy-editing. For further information on our quality control procedures, please visit the SWP website: https://www.swp-berlin.org/en/about-swp/quality-management-for-swp-publications/.

SWP Research Papers reflect the views of the author(s).

SWP

Stiftung Wissenschaft und Politik

German Institute for International and Security Affairs

Ludwigkirchplatz 3–4
10719 Berlin
Germany
Phone +49 30 880 07-0
Fax +49 30 880 07-200
www.swp-berlin.org
swp@swp-berlin.org

ISSN 1863-1053

(Updated English version of SWP‑Studie 1/2019)