Originally published on the Forum on the Arms Trade’s Looking Ahead blog, Erin Hunt looks at opportunities and challenges ahead in 2017 for efforts to preemptively ban autonomous weapons systems.
2017 has the potential to be a pivotal year in efforts to ensure that all weapons have meaningful human control. For three years, the Convention on Conventional Weapons (CCW) has been discussing lethal autonomous weapons (future weapons that could select and fire upon a target without human control). In December 2016, the Review Conference of the CCW decided to establish a Group of Governmental Experts (GGE) chaired by Ambassador Amandeep Singh Gill of India, which will meet over 10 days in 2017 and then report back to the CCW’s annual meeting on 22-24 November.
A GGE is a more formal level of meetings than the ones held in 2014, 2015 and 2016. States will be expected to bring their own experts and participate actively in discussions, instead of listening to presentations by outside experts and asking questions of those experts. The first meeting of the GGE will be held at the UN in Geneva on either 24-28 April or 21-25 August 2017. The date is dependent on when funds are available for the meeting. The second meeting of the GGE will be on 13-17 November, just before the annual CCW meeting.
In 2016, the number of states calling for a pre-emptive ban on fully autonomous weapons more than doubled. At the time of writing, Algeria, Argentina, Bolivia, Chile, Costa Rica, Cuba, Ecuador, Egypt, Ghana, Guatemala, Holy See, Mexico, Nicaragua, Pakistan, Panama, Peru, State of Palestine, Venezuela and Zimbabwe have called for a ban while a number of other states seem to support new international humanitarian law of some sort to deal with autonomous weapons systems.
This GGE is a large step towards a pre-emptive ban on autonomous weapons systems, but there are a number of challenges ahead in 2017. First, the Russian Federation continues to object to more formal talks on autonomous weapons systems on the grounds that it is premature to move forward since there is not a clear understanding of the subject under discussion. That objection overlooks the fact that definitions are usually the last part of disarmament treaties to be negotiated. It was only at the very end of the 2016 CCW Review Conference that Russia agreed not to block the GGE.
Second, the majority of states, including my own, Canada, do not have national policies on autonomous weapons systems. However, this challenge is also an opportunity. The Campaign to Stop Killer Robots will be working hard around the world in 2017 to support the development of national policies on autonomous weapons systems. After three years of informal CCW experts meetings as well as discussions in the Human Rights Council, states have a large amount of information at their disposal to begin to craft national policies. States can also hold consultations on creating a national policy in advance of the GGE meetings.
Third, there is the possibility that the GGE may become distracted by the inclusion of a discussion item on best practices and greater transparency in Article 36 weapons reviews. These legal reviews are an obligation of states developing, purchasing or otherwise acquiring new weapons.
Although Article 36 weapons reviews should be a topic of discussion at the international level to strengthen both policy and practice around the world, better weapons reviews will not solve the problems associated with autonomous weapons systems and should not distract the GGE from the core of its work. Weapons reviews cannot answer moral, ethical, and political questions. An Article 36 review cannot tell us if it is acceptable to the public conscience for a machine to kill without meaningful human control. Autonomous weapons systems are often referred to as a revolution in warfare, and as such, moral, ethical and political considerations must not be pushed aside. These questions need to remain on the international agenda in 2017.
This year, we will witness significant work done at the national and international level to increase understanding of the challenges posed by autonomous weapons as well as the number of states calling for a pre-emptive ban. Stay tuned to see if the international community stands ready at year’s end to ensure that all weapons have meaningful human control.
The Convention on Conventional Weapons (CCW) Review Conference in December will decide whether to hold a Group of Governmental Experts (GGE) meeting on autonomous weapons systems in 2017. A GGE is the logical next step in the work to address concerns about autonomous weapons systems (or killer robots).
The Campaign to Stop Killer Robots is getting ready for the Review Conference here in Canada and around the world. Check out our colleagues at Reaching Critical Will for an update on the Preparatory Meeting of the CCW to see how the international preparations are going.
On the Canadian side, our Program Coordinator, Erin Hunt, was pleased to deliver the Campaign’s statement to the United Nations General Assembly’s First Committee on October 12.
Over the next month and a bit, we will be talking with parliamentarians, civil society and academics to help ensure that Canada takes a strong position at the Review Conference and beyond. You can help by writing your MP to ask that Canada outline a national policy on autonomous weapons or by donating online to support our work.
Today, Mines Action Canada’s Program Coordinator made an intervention during CCW discussions about autonomous weapons systems and weapons review processes.
Thank you Madame Chair. I would like to take this opportunity to share Mines Action Canada’s observations about Article 36 reviews.
Like many others, Mines Action Canada was concerned to learn that there was so little transparency around Article 36 weapons reviews at last year’s experts meeting. The fact that so few states were willing to discuss their weapons review process is a significant impediment to the prevention of humanitarian harm caused by new weapons. Indeed it seems that too few states actually undertake these reviews in a comprehensive manner.
Last year’s revelations concerning Article 36 reviews have made it clear that international discussions on the topic are necessary. Today is a start. States need to be more transparent in their weapons review processes. Sharing criteria and standards or setting international standards will do much to shed light on the shadowy world of arms procurement. Mines Action Canada believes that Article 36 weapons reviews should be a topic of discussion at the international level to strengthen both policy and practice around the world.
However, better weapons reviews will not solve the problems associated with autonomous weapons systems for a number of reasons.
First, there is the issue of timing. A successful international process to increase the effectiveness of weapons reviews will require a significant amount of time – time we do not have in the effort to prevent the use of autonomous weapons systems because technology is developing too rapidly.
Second, weapons reviews were designed for a very different type of weapon than autonomous weapons systems, which have been called the third revolution in warfare. Autonomous weapons systems will blur the line between weapon and soldier to a level that may be beyond the ability of a weapons review process. In addition, the systemic complexity that will be required to operate such a weapons system is a far cry from the more linear processes found in current weapons.
Third, Article 36 reviews are not obligated to cover weapons used for domestic purposes outside of armed conflict such as policing, border control, or crowd control. Mines Action Canada, along with many civil society organizations and states present here, has serious concerns about the possible use of autonomous weapons systems in law enforcement and uses outside of armed conflict more generally.
Fourth and most importantly, weapons reviews cannot answer the moral questions surrounding delegating the kill decision to a machine. An Article 36 review cannot tell us if it is acceptable for an algorithm to kill without meaningful human control. And that is one of the key questions we are grappling with here this week.
Article 36 weapons reviews are a legal obligation for most of the states here. It is time for a separate effort to strengthen the standards and transparency around weapons reviews. That effort must neither distract from nor overtake our work here to deal with the real moral, legal, ethical and security problems associated with autonomous weapons systems. Weapons reviews must be supplemented by new and robust international law that clearly and deliberately puts meaningful human control at the centre of all new weapons development.
The concerns raised by autonomous weapons are urgent and must take priority. In fact, a GGE next year on autonomous weapons will greatly assist future work on weapons reviews by highlighting the many challenges new technologies pose for such reviews.
Overall, there is a need for international work to improve Article 36 reviews, but there is little evidence to back up the claims of some states that weapons review processes would be sufficient to ensure that autonomous weapons systems are acceptable. Article 36 reviews are only useful once questions of the moral and ethical acceptability of a weapon have been dealt with. Until that time, it would be premature to view weapons reviews as a panacea for our issues here at the CCW.
Our Executive Director, Paul Hannon delivered an opening statement at the CCW meeting on autonomous weapons systems today.
Thank you, Chairperson.
I appreciate the opportunity to speak on behalf of Mines Action Canada. Mines Action Canada is a Canadian disarmament organization that has been working to reduce the humanitarian impact of indiscriminate weapons for over twenty years. During this time, we have worked with partners around the world, including here at the CCW, to respond to the global crisis caused by landmines, cluster munitions, and other indiscriminate weapons. What makes this issue different is that we have an opportunity to act now, before a weapon causes a humanitarian catastrophe.
As a co-founder of the Campaign to Stop Killer Robots, Mines Action Canada’s concern with the development of autonomous weapons systems runs across the board. We have numerous legal, moral/ethical, technical, operational, political, and humanitarian concerns about autonomous weapons systems. The question of the acceptability of delegating death is not an abstract thought experiment, but the fundamental question, with policy, legal and technological implications for the real world. We must all keep this question at the fore whenever discussing autonomous weapons systems: do you want to live in a world where algorithms or machines can make the decision to take a life? War is a human activity, and removing the human component in war is dangerous for everybody. We strongly support the position of the Campaign to Stop Killer Robots that permitting machines to take a human life on the battlefield or in policing, border or crowd control, and other circumstances is unacceptable.
We have watched the development of discourse surrounding autonomous weapons systems since the beginning of the campaign. 2015 saw a dramatic expansion of the debate into different forums and segments of our global community and that expansion and the support it has generated have continued into 2016. Be it at artificial intelligence conferences, the World Economic Forum, the Halifax Security Forum or in the media, the call for a pre-emptive ban is reaching new audiences. The momentum towards a pre-emptive ban on autonomous weapons systems is clearly growing.
Mines Action Canada recognizes that there are considerable challenges facing the international community in navigating legal issues concerning an emerging technology. The desire not to hinder research and development into potentially beneficial technologies is understandable, but a pre-emptive ban on autonomous weapons systems will not limit beneficial research. As a senior executive from a robotics company told us at a workshop on autonomous weapons last week, there are no other applications for an autonomous system which can make a “kill or not kill” decision. The function providing an autonomous weapon the ability to make the “kill decision” and implement it has no equivalent civilian use. A pre-emptive ban would have no impact on the funding of research and development for artificial intelligence or robotics.
On the other hand, there are numerous other applications that would benefit society by improving other aspects of robot weapons while maintaining meaningful human control over the decision to cause harm. Communications technology, encryption, virtual reality, sensor technology – all have much broader and beneficial applications, from search and rescue by first responders to watching a school play when you can’t be there in person. None of that research and development would be hindered by a pre-emptive ban on autonomous weapons systems. A pre-emptive ban would, however, allow governments, the private sector and academics to direct investments towards technologies which can have as much future benefit to non-military uses as possible.
While the “kill decision” function is only necessary for one application of robotic technology, predictability is an important requirement for all robots regardless of the context in which they are used. Manufacturing robots work well because they work in a predictable space. Driverless cars will also work in a predictable space, though one much less predictable than a factory, which is one of the reasons they require so much more testing and time to develop. Robotic weapons will be required to work in the least predictable of spaces, that is, in combat, and are therefore much more prone to failure. Commanders, on the other hand, need weapons they can rely on. Civilians need and have a right to expect that every effort is taken to protect them from the harmful effects of conflict.
Mines Action Canada appreciates the significant number of expert presentations scheduled for this week, but we hope that states will take time to share their views throughout the week. It is time for states to begin to talk about their concerns, their positions and their policies. For this reason, we are calling on the High Contracting Parties to take the next step later this year at the Review Conference and establish a Group of Governmental Experts (GGE) with a mandate to negotiate a new protocol on autonomous weapons.
We note that in the last 20 years three new legal instruments have entered into force. Each bans a weapon system, and each was covered by the general rules of International Humanitarian Law at the time, but the international community felt that new specific laws banning these weapons were warranted. This not only strengthened the protection of civilians, but also made IHL more robust.
Autonomous weapons systems are not your average new weapon; they have the potential to fundamentally alter the nature of conflict. As a “game-changer”, autonomous weapons systems deserve a serious and in-depth discussion. That discussion should also happen at the national level. Mines Action Canada hopes that our country will begin that effort this spring through the recently announced defence review and that other states will follow suit with their own national discussions.
At the core of this work is a desire to protect civilians and limit the humanitarian harm caused by armed conflict. We urge states not to lose sight of the end goal and their motivations as they complete the difficult work necessary for a robust and effective pre-emptive ban.
We’re almost a month into 2016 and autonomous weapons systems have already been in the news thanks to a strong panel discussion at the World Economic Forum in Davos. The Campaign to Stop Killer Robots was pleased to see the panel agree that the world needs to start a diplomatic process soon to pre-emptively ban autonomous weapons systems. You can read the whole analysis by the Campaign’s coordinator here.
Yes 2016 is starting on a high note for the campaign but this is not the time to be complacent. We need to keep that momentum going internationally and here in Canada. The new government has yet to share a national policy on autonomous weapons systems. Before the election, the Liberal Party of Canada wrote that:
“Emerging technologies such as Lethal Autonomous Weapon Systems pose new and serious ethical questions that must be studied and understood. The Liberal Party of Canada will work with experts and civil society to ensure that the Canadian Government develops appropriate policies to address the use and proliferation of autonomous weapon systems.”
Now that the Liberals form the government, they will have to develop “appropriate policies” soon because the international community is moving forward, albeit verrrrrry slowly. States are meeting in April 2016 for a third (and hopefully final) informal experts meeting on autonomous weapons systems under the United Nations’ Convention on Conventional Weapons, and then at the end of the year, states will have the opportunity to start negotiations on a pre-emptive ban. The UN process has been called “glacial” and criticized for showing “no sense of urgency,” but there’s time for states to pick up the pace, and Canada can take a leadership role.
Canadian industry, academics and NGOs have already taken a leadership role on banning autonomous weapons systems, so now it’s the government’s turn. The Canadian government and Prime Minister Trudeau made a big impression at the World Economic Forum, so we hope that they will take that energy forward to act on one of the newest issues discussed there. Let’s make 2016 a year of action on autonomous weapons systems.
Next week, states will decide if and how they will continue international talks on autonomous weapons systems at the UN’s Convention on Conventional Weapons in Geneva. We and the whole Campaign to Stop Killer Robots are calling on states to take the next step towards a ban by agreeing to a Group of Governmental Experts (GGE) in 2016. A GGE will allow states to explore the issues surrounding autonomous weapons systems in depth.
With such an important decision looming over states, we are launching the winners of our youth video contest. Last week, we shared the runner-up video.
Today, we are pleased to announce that Steven Hause of Florida State University won the video contest. Steven’s video covers a number of the key concerns the Campaign has about autonomous weapons systems. We hope that this video will remind governments of the need to take action at CCW next week.
Today at the UN in Geneva, states approved a new mandate for further discussions about autonomous weapons systems in 2015. To celebrate, we are pleased to share our new video on why we need to Keep Killer Robots Fiction.
Remember, students, you can make your own killer robots video in our film contest.
This week states are meeting at the United Nations in Geneva to decide if discussion on lethal autonomous weapons systems will continue at the Convention on Conventional Weapons. States should continue to discuss this issue and to debate key problems with autonomous weapons systems. One of the key problems is the issue of human control. Learn more with this new video.
Last week’s meeting at the United Nations was remarkable for a number of reasons. As discussed in an earlier post, this meeting under the Convention on Conventional Weapons was the first international discussion on autonomous weapons systems; it was held less than a year and a half after the first report on the topic was released; and it brought together 87 states to discuss an emerging technology. The meeting was also remarkable for the shocking lack of women invited to speak.
There were 18 experts invited to give presentations to the delegates and all of them were men. Now that might sound like a story line from the final season of Mad Men, but sadly we are talking about a large diplomatic meeting hosted by the United Nations in 2014, not the exploits of Sterling, Cooper, Draper, Pryce in 1965. The Campaign to Stop Killer Robots highlighted that the provisional agenda was unbalanced and suggested numerous possible experts who are leaders in their fields and who are women. And yet the panels proceeded as planned, leaving women, as Matthew Bolton put it, “literally condemned to the margins — only allowed to speak in civil society statements from the back of the room or ‘Side Events’.”
In the opening debate, civil society representatives and Norway commented on the gender disparity and later Christof Heyns, UN Special Rapporteur on Extra-Judicial Killings, also commented on the lack of women presenting. Throughout the meeting, women contributed greatly to the discussion through side-events, statements and interventions when permitted by the meeting’s chair. Also, many of the memos and papers provided by civil society were written or co-authored by women.
Civil society, including the Campaign to Stop Killer Robots, has taken action to address this anachronistic situation. Sarah Knuckey began compiling a list of women working, writing and speaking on autonomous weapons – the list currently includes over 25 names and is growing. Article 36, a co-founder of the Campaign to Stop Killer Robots, is compiling a list of people working in the field of peace and security – particularly disarmament, arms control and the protection of civilians – who benefit from their male gender and have committed not to speak on panels that include only men. They say:
We believe that the practice of selecting only men to speak on panels in global policymaking forums is unjust. It excludes the voices of women and other gender identities from such events, running counter to UN Security Council Resolution 1325, which commits to inclusion of women in discussions on peace and security. Global policymaking efforts on peace and security – including disarmament, arms control and the protection of civilians – must include people of a diversity of gender identities.
Mines Action Canada supports this new effort and encourages others working in this field who identify as men to join the initiative. The gender disparity at the meeting was so glaring that Motherboard covered the issue and the story was picked up by io9. As someone with a passing interest in the construction of ideas and norms, I find the discussion surrounding this issue on io9 very interesting. I read the internet comments so you don’t have to, and there are a few aspects of that online conversation I would like to address.
First up is the frequent comment: why does gender matter when discussing autonomous weapons? Having only men invited to speak at the UN as experts on autonomous weapons matters for a number of reasons, as do gender considerations at the CCW. I feel ridiculous listing reasons why women should be included in global policy-making forums since it is (as stated above) 2014, not 1965, but for brevity’s sake here are a couple of reasons unique to the autonomous weapons discussion:
- The United Nations passed Security Council Resolution 1325 in October 2000 vowing to include women in global policy making on peace and security. Resolution 1325 calls on states to “ensure increased representation of women at all decision-making levels in national, regional and international institutions and mechanisms for the prevention, management, and resolution of conflict.” Having no women presenting at a UN meeting on an emerging weapon seems pretty contrary to Resolution 1325.
- The growing consensus is that autonomous weapons are a ‘game-changer’, something that will fundamentally alter the nature of warfare globally. We need to have widespread discussions about the role of humanity in conflict. Having only (mostly Western, middle-aged) men speak on a topic that will have a dramatic impact on lives around the world leaves out a large number of voices crucial to the needed discussion.
- Proponents of autonomous weapons say they will be good for humanity because robots will not commit war crimes and, specifically, robots will not rape. Charli Carpenter has an excellent piece dismantling the “robots won’t rape” argument, in which she points out that rape is not just a crime of passion by one rogue soldier or a deranged warlord; often rape and other war crimes are ordered by the state. Furthermore, the idea that rape victims and women’s bodies in general are being used for political gain in a male-dominated discussion about new weapon technology is abhorrent.
Another common line of commenting on this story was the idea that the organizers got the best experts to present on these topics and that, unfortunately, when it comes to things like science and engineering, most of the experts are men. Since this is not the place to discuss why there are more men than women in STEM fields, I’ll move on to the assertion that they got the best experts to present. I don’t have to say much because Sarah Knuckey’s list has made it quite clear there are a number of women who are at the top of their fields and “experts” on the subject matter discussed last week. But it is worth highlighting that the Harvard-based legal scholar who wrote the first report on the legal arguments surrounding autonomous weapons, launching the global discussion (and who is a woman), was not included in either panel discussing legal issues. Another troubling part of this idea is the notion that decisions over autonomy and human control in conflict should be handled only by experts in technical fields like computer science. The potential impact of autonomous weapons necessitates in-depth technical, legal, ethical and moral analysis. A perceived gender imbalance in STEM does not justify only hearing from men on all topics of discussion.
I have ignored many of the blatantly misogynistic comments on the io9 piece about the lack of women at CCW and the work of obvious trolls but there is one more theme in the comments I would like to address. More than one commenter stated something like “if they overlooked people that were more qualified to be present then it absolutely needs to be addressed [emphasis mine].” The idea that women have to be better than men before their opinion should be taken into consideration is rather insidious. It can be linked to the so-called confidence gap between men and women among other aspects of gender dynamics in the workplace. I see this idea even in my own life – just last week, I did extra reading prior to a meeting because I felt that, as a young woman, I needed to know the topic better than anyone else before they would take me seriously. One of the lessons I will take from this discussion of gender in global policy development spawned by the lack of women at the CCW meeting is that it is beyond time to ask the question why should a woman have to be more qualified rather than just as qualified as a man to be considered an expert?
Last week’s CCW meeting made much progress in the global discussion of autonomous weapons systems despite the regressive gender dynamics but we cannot continue on that path without recognizing the capabilities and expertise offered by women. We cannot continue to miss half the conversation. Civil society is taking action to improve gender representation in policy making and the media has recognized women as experts on this topic on numerous occasions so now it is up to the states. It is time for states to get serious about implementing Resolution 1325. It is time for states to hear more than half the story.
Update May 23: the International Committee for Robot Arms Control has published a list of its world-leading female experts to prevent anyone from using the excuse that there are no suitable women experts.
Last week, 87 states gathered in Geneva to discuss lethal autonomous weapons systems.
This Informal Experts Meeting ran from May 13 to May 16 and was the first international discussion on autonomous weapons systems. The meeting was focused on information rather than decision making. The 87 states attended the meeting under the Convention on Conventional Weapons (CCW) along with representatives from UN agencies including UNIDIR, the International Committee of the Red Cross (ICRC), and registered non-governmental organizations including the delegation of the Campaign to Stop Killer Robots.
The four day meeting included general debate and then substantive sessions with presentations from experts. The Chair’s summary showed that there is a willingness to pursue this topic and a possible issue for the next meetings would be the concept of meaningful human control. The options for going forward cited include exchange of information, development of best practices, moratorium on research, and a ban. The Campaign to Stop Killer Robots has a great piece about the meeting on their website.
Over the course of the week, many states highlighted the importance of always maintaining meaningful human control over targeting and attack decisions. We at MAC were not only pleased that 5 countries have already called for a ban, but also that no country vigorously defended or argued for autonomous weapons systems, although the Czech Republic and Israel each spoke on the desirability of such systems.
Unlike most countries, Canada has not yet provided copies of its statements to Reaching Critical Will or to the United Nations, so we have had to piece together the statements from the CCW Review and Twitter. On day 1, Canada was the only country to say that existing international humanitarian law is sufficient to regulate the use of autonomous weapons. It also said that the definition of autonomy is difficult, as autonomy is subjective depending on the system. On day 2, Canada said that the moral aspects of autonomous weapons are important and must be part of discussions in the CCW. It looks like Canada did not make any statements or interventions on day 3. On day 4, Canada called for more discussion on the ethical and political issues, including meaningful human control, under the CCW. Canada also said humanitarian and state security concerns must be balanced in considering autonomous weapons – which is language usually heard from Russia, China and similar states.
Some of the presentations from the substantive sessions are available online:
Technological Issues – key topics included definitions of autonomy and meaningful human control. The session included a debate between Ron Arkin, who believes that it is premature to ban autonomous weapons, and Noel Sharkey, who does not believe that computerised weapons without a human in control can fully comply with international humanitarian law in the foreseeable future.
Ethics and Sociology – key topics included whether machines should make the decision to take a human life, the relevance of human judgement to international law and the need for human control.
Legal Issues (International Humanitarian Law) – key topics included definitions, whether or not autonomous weapons systems are inherently illegal, morality and military effectiveness. This was an extensive debate.
Legal Issues (other areas of international law) – key topics included human rights law, accountability and article 36 weapons reviews.
Operational and military issues – key topics included meaningful human control, military effectiveness and the nature of warfare.
The Campaign to Stop Killer Robots held side events each day to delve deeper into the issues at hand. These well-attended side events featured lively discussions that covered the topics in greater depth.
While the meetings were progressing in Geneva, here at the national level Mines Action Canada was working to ensure these historic sessions received media coverage across Canada. For example:
- Paul Hannon was on Calgary’s News Talk 770 and News Talk 610 in St. Catharines.
- Erin Hunt was on Kevin Newman Live (starts 2:40 mark) and CFAX 1070 in Victoria (starts 6:07 mark).
- Dr. Ian Kerr was on Ontario Today – you should definitely check out the call of the day.
- Prof. Noel Sharkey was on CBC’s As It Happens (starts at 9:40 mark).
- The Globe and Mail, the Weather Network, Global News, CTV News, Ottawa Citizen and Metro also covered the issue while the Ottawa Citizen Defense Blog picked up our press release.
CCW member states will reconvene in November to decide if they want to continue these talks. Until then Mines Action Canada and our colleagues in the international campaign will continue to push for a renewed and expanded mandate including continued discussions on meaningful human control over all targeting and firing decisions.