Majority of Canadians Oppose Killer Robots

New poll indicates that 55% of Canadians oppose autonomous weapons systems

This week, Ipsos released results from the first global public opinion survey that included a question on autonomous weapons. Autonomous weapons, sometimes called killer robots, are future weapons that could select and fire upon a target without human control. Ipsos found that 55% of Canadians surveyed opposed autonomous weapons while another 25% were uncertain about the technology.

In the survey, 11,500 citizens across 25[1] countries were asked: “The United Nations is reviewing the strategic, legal and moral implications of autonomous weapons systems. These systems are capable of independently selecting targets and attacking those targets without human intervention; they are thus different than current day ‘drones’ where humans select and attack targets. How do you feel about the use of autonomous weapons in war?” In all but five countries (France, India, the US, China and Poland), a clear majority opposed the use of autonomous weapons in war.

Among Canadians, 21% of respondents reported being somewhat opposed to autonomous weapons in war while 34% were strongly opposed to the technology being used in war. Only 5% of Canadians surveyed were strongly supportive of using autonomous weapons in war. This survey is the first to poll Canadians on autonomous weapons systems.

“As part of the Campaign to Stop Killer Robots, we have frequently heard from Canadians that they want to ensure that there is meaningful human control over weapons at all times. This survey confirms that those opinions represent the majority of Canadians,” said Paul Hannon, Executive Director of Mines Action Canada, a co-founder of the Campaign to Stop Killer Robots. “In addition to strong citizen opposition to the use of autonomous weapons in war, Canada also has the first robotics company in the world to vow to never build autonomous weapons, Clearpath Robotics. It is time for the Canadian government to catch up to the citizens and come up with a national policy on autonomous weapons.”

Mines Action Canada is calling on the Government of Canada to ensure meaningful public and Parliamentary involvement in drafting Canada’s national position on autonomous weapons systems prior to the United Nations talks on the subject later this year.

– 30 –

Media Contact:  Erin Hunt, Program Coordinator, Mines Action Canada, + 1 613 241-3777 (office), + 1 613 302-3088 (mobile) or erin@minesactioncanada.org.

[1] Argentina, Belgium, Mexico, Poland, Russia, Saudi Arabia, South Africa, South Korea, Sweden, Turkey, Hungary, Australia, Brazil, Canada, China, France, Germany, Great Britain, India, Italy, Japan, Spain, Peru and the United States of America.

 

A pivotal year ahead

In this piece, originally published on the Forum on the Arms Trade’s Looking Ahead blog, Erin Hunt looks at opportunities and challenges ahead in 2017 for efforts to preemptively ban autonomous weapons systems.

2017 has the potential to be a pivotal year in efforts to ensure that all weapons have meaningful human control. For three years, the Convention on Conventional Weapons (CCW) has been discussing lethal autonomous weapons (future weapons that could select and fire upon a target without human control). In December 2016, the Review Conference of the CCW decided to establish a Group of Governmental Experts (GGE), chaired by Ambassador Amandeep Singh Gill of India, which will meet over 10 days in 2017 and then report back to the CCW’s annual meeting on 22-24 November.

A GGE is a more formal level of meetings than the ones held in 2014, 2015 and 2016. States will be expected to bring their own experts and participate actively in discussions, instead of listening to presentations by outside experts and asking questions of those experts. The first meeting of the GGE will be held at the UN in Geneva on either 24-28 April or 21-25 August 2017. The date is dependent on when funds are available for the meeting. The second meeting of the GGE will be on 13-17 November, just before the annual CCW meeting.

In 2016, the number of states calling for a pre-emptive ban on fully autonomous weapons more than doubled.  At the time of writing, Algeria, Argentina, Bolivia, Chile, Costa Rica, Cuba, Ecuador, Egypt, Ghana, Guatemala, Holy See, Mexico, Nicaragua, Pakistan, Panama, Peru, State of Palestine, Venezuela and Zimbabwe have called for a ban while a number of other states seem to support new international humanitarian law of some sort to deal with autonomous weapons systems.

This GGE is a large step towards a pre-emptive ban on autonomous weapons systems, but there are a number of challenges ahead in 2017. First, the Russian Federation continues to object to more formal talks on autonomous weapons systems on the grounds that it is premature to move forward without a clear understanding of the subject under discussion. That objection overlooks the fact that definitions are usually the last part of disarmament treaties to be negotiated. It was only at the very end of the 2016 CCW Review Conference that Russia agreed not to block the GGE.

Second, the majority of states, including my own, Canada, do not have national policies on autonomous weapons systems.  However, this challenge is also an opportunity. The Campaign to Stop Killer Robots will be working hard around the world in 2017 to support the development of national policies on autonomous weapons systems.  After three years of informal CCW experts meetings as well as discussions in the Human Rights Council, states have a large amount of information at their disposal to begin to craft national policies. States can also hold consultations on creating a national policy in advance of the GGE meetings.

Third, there is the possibility that the GGE may become distracted by the inclusion of a discussion item on best practices and greater transparency in Article 36 weapons reviews. These legal reviews are an obligation of states developing, purchasing or otherwise acquiring new weapons.

Although Article 36 weapons reviews should be a topic of discussion at the international level to strengthen both policy and practice around the world, better weapons reviews will not solve the problems associated with autonomous weapons systems and should not distract the GGE from the core of its work. Weapons reviews cannot answer moral, ethical, and political questions. An Article 36 review cannot tell us if it is acceptable to the public conscience for a machine to kill without meaningful human control. Autonomous weapons systems are often referred to as a revolution in warfare and, as such, moral, ethical and political considerations must not be pushed aside. These questions need to remain on the international agenda in 2017.

This year, we will witness significant work done at the national and international level to increase understanding of the challenges posed by autonomous weapons as well as the number of states calling for a pre-emptive ban. Stay tuned to see if the international community stands ready at year’s end to ensure that all weapons have meaningful human control.

Success at CCW!

The 5th Review Conference of the Convention on Conventional Weapons (CCW) wrapped up today in Geneva and we’re very pleased that states agreed to hold two weeks of formal meetings in 2017 to discuss autonomous weapons. This Group of Governmental Experts (GGE) is the next step towards new international law about autonomous weapons. The international Campaign to Stop Killer Robots has a comment on the GGE decision online.

It’s been a busy week at CCW: Mines Action Canada delivered a statement in the General Debate, and then we worked with our campaign colleagues to shore up support for the GGE.

So that you don’t miss out on any of the week’s events, we’ve created daily recaps in both Storify and video format. This week marks the start of a whole new phase of our efforts to ban killer robots. Donate today to support our work.

Day One – Storify and Video update

Day Two  – Storify and Video update

Day Three – Storify and Video update

Day Four – Storify and Video update

Day Five – Storify and Video update

CCW RevCon Statement

Thank you Chair. I appreciate the opportunity to speak on behalf of Mines Action Canada.

Although today we are starting the 5th Review Conference of the Convention on Conventional Weapons, we must spend our time looking forward. We are entrusted with preventing humanitarian harm from existing weapons like incendiary weapons and from future weapons that will require new legal instruments to avoid catastrophes to come.

CCW has spent three years holding informal meetings about autonomous weapons systems. At times during those discussions, we have felt that some have underestimated the skills, knowledge, intelligence, training, experience, humanity and morality that women and men in uniform combine with situational awareness and international humanitarian law (IHL) to make decisions during conflict. We work closely with roboticists and engineers, but despite their expertise and the high quality of their work, we do not believe an algorithm could replicate this complex and very human decision-making process. Robotics should only be used to inform and supplement human decision making.

In the CCW’s work on autonomous weapons systems, we have learned more about Article 36 reviews, but it is clear that states need to be more transparent, systematic and rigorous in their weapons review processes. Mines Action Canada believes that Article 36 weapons reviews should be a topic of discussion at the international level to strengthen both policy and practice around the world.

However, better weapons reviews will not solve the problems associated with autonomous weapons systems. For example, Article 36 reviews are not obligated to cover weapons used for domestic purposes outside of armed conflict such as policing, border control, or crowd control. Most importantly, weapons reviews cannot answer moral, ethical, technical and political questions. An Article 36 review cannot tell us if it is acceptable to the public conscience for a machine to kill without meaningful human control.

It is time for a separate effort to strengthen the standards and transparency around weapons reviews. That effort must neither distract from nor overtake our work here to deal with the real moral, legal, ethical and security problems associated with autonomous weapons systems. Weapons reviews must be grounded in new and robust international law that clearly and deliberately puts meaningful human control at the centre of all weapons development.

The concerns raised by autonomous weapons are urgent and must take priority. If we wait to start a Group of Governmental Experts until everyone has a clear understanding of every aspect of the issue, the window of opportunity to prevent humanitarian harm from autonomous weapons will close. A GGE will allow high contracting parties to develop their understanding of the issue and to pursue effective outcomes.

In Canada, particularly, this year’s defence review offered an opportunity for the government to hear from a number of experts on autonomous weapons systems. A GGE next year would give Canada the opportunity to share the results of that process and to contribute to our collective understanding of the issue.

Mines Action Canada, as a co-founder of the Campaign to Stop Killer Robots, believes that the way forward must lead to a pre-emptive ban on autonomous weapons systems as a tool to prevent humanitarian harm without damaging research and development on autonomy and robotics for military or civilian purposes. Earlier this year, a Canadian robotics expert made it clear that there are no other applications for an autonomous system which can make a “kill or not kill” decision. The function giving an autonomous weapon the ability to make the “kill decision” does not have an equivalent civilian use; therefore, a pre-emptive ban on autonomous weapons systems would have no impact on the funding of research and development for artificial intelligence.

As experts at the meeting in April made clear, our window of opportunity to prevent future humanitarian harm from autonomous weapons will not stay open long, so we need to be moving forward at this Review Conference. Therefore, we urge states to accept the recommendation for an open-ended Group of Governmental Experts next year.

Gearing up for the Review Conference

The Convention on Conventional Weapons (CCW) Review Conference in December will decide whether to hold a Group of Governmental Experts (GGE) meeting on autonomous weapons systems in 2017. A GGE is the logical next step in the work to address concerns about autonomous weapons systems (or killer robots).

The Campaign to Stop Killer Robots is getting ready for the Review Conference here in Canada and around the world. Check out our colleagues at Reaching Critical Will for an update on the Preparatory Meeting of the CCW to see how the international preparations are going.

On the Canadian side, our Program Coordinator, Erin Hunt, was pleased to deliver the Campaign’s statement to the United Nations General Assembly’s First Committee on October 12.

Over the next month and a bit, we will be talking with parliamentarians, civil society and academics to help ensure that Canada takes a strong position at the Review Conference and beyond. You can help by writing your MP to ask that Canada outline a national policy on autonomous weapons or by donating online to support our work.

 

Interview with Bloomberg TV

This summer, our Executive Director, Paul Hannon, spoke with Bloomberg TV about autonomous weapons systems. You can see the whole interview here.

 

December is decision time at CCW

For the third year, the Convention on Conventional Weapons (CCW) met in Geneva to work on autonomous weapons systems. From April 11th to 15th, 2016, the informal experts meeting held at the United Nations addressed a wide range of issues and concerns. The meeting was attended by 94 member and observer states to the 1980 Convention on Conventional Weapons. This marked the highest turnout yet, reflecting the increasing awareness and growing concern about these future weapons.

The CCW is a framework treaty that prohibits or restricts certain conventional weapons deemed to be excessively injurious or to have indiscriminate effects. The meeting included members of UN agencies, including the UN Institute for Disarmament Research, and the International Committee of the Red Cross (ICRC). The Campaign to Stop Killer Robots, with a diverse and passionate group of 40 campaigners from 10 countries, was active and provided substantive input and expert analysis in discussions on issues such as meaningful human control and the weaknesses of Article 36 weapon reviews, among other topics.

As a co-founder of the Campaign to Stop Killer Robots, Mines Action Canada (MAC) supports a pre-emptive ban on lethal autonomous weapons systems before they are ever developed or used. MAC was well represented at the meetings and gave statements about the repercussions of allowing such weapons to be developed. Executive Director Paul Hannon’s statement talked about the need to implement a ban on these systems before a humanitarian catastrophe occurs as well as the growing support for a pre-emptive ban. Program Coordinator Erin Hunt’s statement detailed the limitations of weapon reviews for lethal autonomous weapons systems and the importance of prioritizing a ban on these systems. Project Ploughshares, a new Canadian member of the Campaign to Stop Killer Robots, also participated actively.

Encouragingly, five nations (Algeria, Chile, Costa Rica, Mexico, and Nicaragua) called for a pre-emptive ban on lethal autonomous weapons systems, bringing the total to 14 states that now support this goal. Throughout the meeting, Cuba, Ecuador, the Holy See and Pakistan, as well as all NGO speakers, reiterated the need for a ban on lethal autonomous weapons systems.

Although many nations, including Canada, were not willing at this time to support a ban on lethal autonomous weapons systems, the importance of meaningful human control was underlined many times during the five-day meeting. The Netherlands announced a new policy in support of meaningful human control over deployed weapon systems. As well, Austria noted the recommendation by two UN Special Rapporteurs in their February 2016 report to prohibit lethal autonomous weapons systems that require no meaningful human control. The report can be found here, with the reference to lethal autonomous weapons systems on page 17.

The common area of concern regarding lethal autonomous weapons systems is their inability to properly target and apply force in a strike. Without human control over key points, namely the identification of targets and the approval of a strike, innocent civilians could be inappropriately targeted and killed. It is important to maintain meaningful human control over the critical functions of weapons systems to ensure proper application of international humanitarian law as well as accountability should a target later be found to have been inappropriate. These safeguards cannot be guaranteed with lethal autonomous weapons, which thus threaten international security and stability.

The proliferation of lethal autonomous weapons systems risks the development of an arms race and a weakening of global security. Prohibitions and tight restrictions have helped to calm arms races and promote international peace. As well, lethal autonomous weapons systems threaten the protection of civilians in conflict zones. Autonomous robotic systems are most successful in predictable and reliable environments. However, autonomous weapons systems would operate in highly unpredictable conflict zones and thus risk performing unreliably and endangering civilians.

NGOs remain concerned over the lack of concrete action taken on this issue. Strong and substantive statements were made by Mines Action Canada, Nobel Women’s Initiative, Human Rights Watch and others at CCW in April, urging states to establish an open-ended Group of Governmental Experts (GGE). A GGE would undertake an in-depth study of the issues surrounding lethal autonomous weapons and submit its findings to the UN. It would be the first step in understanding how lethal autonomous weapons systems would comply with international humanitarian law, their effects on regional and global security, the instability they could create and the risk of an arms race. In addition, the GGE would give the CCW time to explore the blurred line between soldier and weapon that fully autonomous weapon systems present. A short video compiled by the Campaign to Stop Killer Robots at the meeting in April can be seen here and daily updates can be found online for Monday, Tuesday, Wednesday, Thursday and Friday.

The meeting ended with some recommendations for the future, including that states ‘may decide to establish’ a GGE at the Fifth Review Conference of the High Contracting Parties to the Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons in December 2016. In addition, the recommendations state that the GGE should work to identify the characteristics of lethal autonomous weapons systems and create a working definition.

Although most interventions supported the formation of a GGE as the next step in this process, the final decision will not be made until the Review Conference in December. While Mines Action Canada is cautiously optimistic that a Group of Governmental Experts will be formalized at the meeting this December, we remain concerned that the pace of these discussions is not keeping up with the speed of technological change. Of course, the specific mandate of the GGE will be all-important.

Much work remains over the coming months.

The Campaign to Stop Killer Robots released its report on the activities undertaken at the third meeting of the Convention on Conventional Weapons; it can be found here.

A post by our summer student, Miranda Barclay.

 

Mines Action Canada’s intervention on Article 36 weapons reviews

Today, Mines Action Canada’s Program Coordinator made an intervention during CCW discussions about autonomous weapons systems and weapons review processes.

Thank you Madame Chair. I would like to take this opportunity to share Mines Action Canada’s observations about Article 36 reviews.

Like many others, Mines Action Canada was concerned to learn that there was so little transparency around Article 36 weapons reviews at last year’s experts meeting. The fact that so few states were willing to discuss their weapons review process is a significant impediment to the prevention of humanitarian harm caused by new weapons. Indeed it seems that too few states actually undertake these reviews in a comprehensive manner.

Last year’s revelations concerning Article 36 reviews have made it clear that international discussions on the topic are necessary. Today is a start. States need to be more transparent in their weapons review processes. Sharing criteria and standards or setting international standards will do much to shed light on the shadowy world of arms procurement. Mines Action Canada believes that Article 36 weapons reviews should be a topic of discussion at the international level to strengthen both policy and practice around the world.

However, better weapons reviews will not solve the problems associated with autonomous weapons systems for a number of reasons.

First, there is the issue of timing. A successful international process to increase the effectiveness of weapons reviews will require a significant amount of time – time we do not have in the effort to prevent the use of autonomous weapons systems because technology is developing too rapidly.

Second, weapons reviews were designed for a very different type of weapon than autonomous weapons systems, which have been called the third revolution in warfare. Autonomous weapons systems will blur the line between weapon and soldier to a level that may be beyond the ability of a weapons review process. In addition, the systemic complexity that will be required to operate such a weapons system is a far cry from the more linear processes found in current weapons.

Third, Article 36 reviews are not obligated to cover weapons used for domestic purposes outside of armed conflict such as policing, border control, or crowd control. Mines Action Canada, along with many civil society organizations and states present here, has serious concerns about the possible use of autonomous weapons systems in law enforcement and uses outside of armed conflict more generally.

Fourth and most importantly, weapons reviews cannot answer the moral questions surrounding delegating the kill decision to a machine. An Article 36 review cannot tell us if it is acceptable for an algorithm to kill without meaningful human control. And that is one of the key questions we are grappling with here this week.

Article 36 weapons reviews are a legal obligation for most of the states here. It is time for a separate effort to strengthen the standards and transparency around weapons reviews. That effort must neither distract from nor overtake our work here to deal with the real moral, legal, ethical and security problems associated with autonomous weapons systems. Weapons reviews must be supplemented by new and robust international law that clearly and deliberately puts meaningful human control at the centre of all new weapons development.

The concerns raised by autonomous weapons are urgent and must take priority. In fact, a GGE next year on autonomous weapons will greatly assist future work on weapons reviews by highlighting the many challenges new technologies pose for such reviews.

Overall, there is a need for international work to improve Article 36 reviews but there is little evidence to back up the claims of some states that weapons review processes would be sufficient to ensure that autonomous weapons systems are acceptable. Article 36 reviews are only useful once questions of the moral and ethical acceptability of a weapon have been dealt with. Until that time, it would be premature to view weapons review as a panacea to our issues here at CCW.

Thank you.

Opening Statement at CCW in 2016

Our Executive Director, Paul Hannon, delivered an opening statement at the CCW meeting on autonomous weapons systems today.

Thank you, Chairperson.

I appreciate the opportunity to speak on behalf of Mines Action Canada. Mines Action Canada is a Canadian disarmament organization that has been working to reduce the humanitarian impact of indiscriminate weapons for over twenty years. During this time, we have worked with partners around the world including here at the CCW to respond to the global crisis caused by landmines, cluster munitions, and other indiscriminate weapons. What makes this issue different is we have an opportunity to act now before a weapon causes a humanitarian catastrophe.

As a co-founder of the Campaign to Stop Killer Robots, Mines Action Canada is concerned about the development of autonomous weapons systems across the board. We have numerous legal, moral/ethical, technical, operational, political, and humanitarian concerns about autonomous weapons systems. The question of the acceptability of delegating death is not an abstract thought experiment, but is the fundamental question with policy, legal and technological implications for the real world. We must all keep this question at the fore whenever discussing autonomous weapons systems: do you want to live in a world where algorithms or machines can make the decision to take a life? War is a human activity and removing the human component in war is dangerous for everybody. We strongly support the position of the Campaign to Stop Killer Robots that permitting machines to take a human life on the battlefield or in policing, border or crowd control, and other circumstances is unacceptable.

We have watched the development of discourse surrounding autonomous weapons systems since the beginning of the campaign. 2015 saw a dramatic expansion of the debate into different forums and segments of our global community and that expansion and the support it has generated have continued into 2016. Be it at artificial intelligence conferences, the World Economic Forum, the Halifax Security Forum or in the media, the call for a pre-emptive ban is reaching new audiences. The momentum towards a pre-emptive ban on autonomous weapons systems is clearly growing.

Mines Action Canada recognizes that there are considerable challenges facing the international community in navigating legal issues concerning an emerging technology. The desire not to hinder research and development into potentially beneficial technologies is understandable, but a pre-emptive ban on autonomous weapons systems will not limit beneficial research. As a senior executive from a robotics company told us at a workshop on autonomous weapons last week, there are no other applications for an autonomous system which can make a “kill or not kill” decision. The function giving an autonomous weapon the ability to make the “kill decision” and implement it does not have an equivalent civilian use. A pre-emptive ban would have no impact on the funding of research and development for artificial intelligence or robotics.

On the other hand, there are numerous other applications that would benefit society by improving other aspects of robotic weapons while maintaining meaningful human control over the decision to cause harm. Communications technology, encryption, virtual reality, sensor technology – all have much broader and beneficial applications, from search and rescue by first responders to watching a school play when you can’t be there in person. None of that research and development would be hindered by a pre-emptive ban on autonomous weapons systems. A pre-emptive ban would, however, allow governments, the private sector and academics to direct investments towards technologies that can have as much future benefit to non-military uses as possible.

While the “kill decision” function is only necessary for one application of robotic technology, predictability is an important requirement for all robots regardless of the context in which they are used. Manufacturing robots work well because they work in a predictable space. Driverless cars will also work in a predictable space, though one much less predictable than a factory, which is one of the reasons they require so much more testing and time to develop. Robotic weapons would be required to work in the least predictable of spaces, that is, in combat, and are therefore much more prone to failure. Commanders, on the other hand, need weapons they can rely on. Civilians need and have a right to expect that every effort is taken to protect them from the harmful effects of conflict.

Mines Action Canada appreciates the significant number of expert presentations scheduled for this week, but we hope that states will take time to share their views throughout the week. It is time for states to begin to talk about their concerns, their positions and their policies. For this reason, we are calling on the High Contracting Parties to take the next step later this year at the Review Conference and establish a GGE with a mandate to negotiate a new protocol on autonomous weapons.

We note that in the last 20 years three new legal instruments have entered into force. Each bans a weapon system, and each was covered by the general rules of International Humanitarian Law at the time, but the international community felt that new, specific laws banning these weapons were warranted. This not only strengthened the protection of civilians, but also made IHL more robust.

Autonomous weapons systems are not your average new weapon; they have the potential to fundamentally alter the nature of conflict. As a “game-changer,” autonomous weapons systems deserve a serious and in-depth discussion. That discussion should also happen at the national level. Mines Action Canada hopes that our country will begin that effort this spring through the recently announced defence review and that other states will follow suit with their own national discussions.

At the core of this work is a desire to protect civilians and limit the humanitarian harm caused by armed conflict. We urge states not to lose sight of the end goal and their motivations as they complete the difficult work necessary for a robust and effective pre-emptive ban.
Thank you.

CCW – What happened last year?

With the third and hopefully final Convention on Conventional Weapons (CCW) informal experts meeting coming up in a couple of days, it is important to remind ourselves of what was discussed last year and what work still needs to be done.

The gathering of the CCW member states and organisations in Geneva in April 2015 was designed as a forum at which states could discuss the important technical, legal, moral and ethical issues surrounding autonomous weapons, otherwise known as ‘killer robots’.

At the 2015 meetings, almost all states that spoke agreed that further work is necessary and desirable, and many expressed that no autonomous weapons should be allowed to operate without meaningful human control, nor with human control that is ‘devoid of meaning.’ There were, however, a small number of states that were more reserved regarding the eventual achievement of a pre-emptive ban on autonomous weapons. The US and Israel implied that they plan to leave the door open for the future acquisition of these weapons, while France and the UK stated that they would not pursue killer robots yet neither indicated support for the logical conclusion of a pre-emptive ban.

Another important notion that arose from the CCW 2015 meetings was that autonomous weapons, or killer robots, are not an inevitable piece of weaponry and should never be allowed to become one. This notion was a useful counterpoint to some interventions that seemed to underestimate the value and importance of human soldiers.

Further, the CCW focused heavily on norm creation, with members emphasising the need to establish norms in order to efficiently discuss and articulate what is most disturbing and threatening about the possibility of autonomous weapons use. Once these norms are clearly established and accepted by a majority of states, hopefully there will be a more concerted effort to transform these norms into fully ratified international laws.

Finally, multiple countries and organisations identified the need to define what exactly some of the key terms commonly used at the conference meant. For example, what exactly is meant by ‘meaningful human control’? Further explorations of this principle could be a key component of a Group of Governmental Experts in 2017 leading to a process to prevent the use of fully autonomous weapons through law.

Hopefully, this year some more solid definitions can be agreed upon and a Group of Governmental Experts will be called for next year, so that the process of banning autonomous weapons through international law can be accelerated, leading to a pre-emptive ban.

Claudia Pearson is an undergraduate student at the University of Leeds, currently studying abroad at the University of Ottawa.