There were 18 experts invited to give presentations to the delegates and all of them were men. Now that might sound like a story line from the final season of Mad Men, but sadly we are talking about a large diplomatic meeting hosted by the United Nations in 2014, not the exploits of Sterling, Cooper, Draper, Pryce in 1965. The Campaign to Stop Killer Robots highlighted that the provisional agenda was unbalanced and suggested numerous possible experts who are leaders in their fields and who are women. And yet the panels proceeded as planned, leaving women, as Matthew Bolton put it, “literally condemned to the margins — only allowed to speak in civil society statements from the back of the room or ‘Side Events’.”
In the opening debate, civil society representatives and Norway commented on the gender disparity and later Christof Heyns, UN Special Rapporteur on Extra-Judicial Killings, also commented on the lack of women presenting. Throughout the meeting, women contributed greatly to the discussion through side-events, statements and interventions when permitted by the meeting’s chair. Also, many of the memos and papers provided by civil society were written or co-authored by women.
Civil society, including the Campaign to Stop Killer Robots, has taken action to address this anachronistic situation. Sarah Knuckey began compiling a list of women working, writing and speaking on autonomous weapons – the list currently includes over 25 names and is still growing. Article 36, a co-founder of the Campaign to Stop Killer Robots, is compiling a list of people working in the field of peace and security – particularly disarmament, arms control and the protection of civilians – who benefit from their male gender and have committed not to speak on panels that include only men. They say:
We believe that the practice of selecting only men to speak on panels in global policymaking forums is unjust. It excludes the voices of women and other gender identities from such events, running counter to UN Security Council Resolution 1325, which commits to inclusion of women in discussions on peace and security. Global policymaking efforts on peace and security – including disarmament, arms control and the protection of civilians – must include people of a diversity of gender identities.
Mines Action Canada supports this new effort and encourages others working in this field who identify as men to join the initiative. The gender disparity at the meeting was so glaring that Motherboard covered the issue and the story was picked up by io9. As someone with a passing interest in the construction of ideas and norms, I found the discussion surrounding this issue on io9 very interesting. I read the internet comments so you don’t have to, and there are a few aspects of that online conversation I would like to address.
First up is the frequent comment: why does gender matter when discussing autonomous weapons? Having only men invited to speak at the UN as experts on autonomous weapons, and overlooking gender considerations at the CCW, matters for a number of reasons. I feel ridiculous listing reasons why women should be included in global policymaking forums since it is (as stated above) 2014, not 1965, but for brevity’s sake here are a couple of reasons unique to the autonomous weapons discussion:
Another common line of commenting on this story was the idea that the organizers got the best experts to present on these topics and, unfortunately, when it comes to things like science and engineering, most of the experts are men. Since this is not the place to discuss why there are more men than women in STEM fields, I’ll move on to the assertion that they got the best experts to present. I don’t have to say much because Sarah Knuckey’s list has made it quite clear there are a number of women who are at the top of their fields and “experts” on the subject matter discussed last week. But it is worth highlighting that the Harvard-based legal scholar who wrote the first report on the legal arguments surrounding autonomous weapons, launching the global discussion (and who is a woman), was not included in either panel discussing legal issues. Another troubling part of this idea is the assumption that decisions over autonomy and human control in conflict should be handled only by experts in technical fields like computer science. The potential impact of autonomous weapons necessitates in-depth technical, legal, ethical and moral analysis. A perceived gender imbalance in STEM does not justify only hearing from men on all topics of discussion.
I have ignored many of the blatantly misogynistic comments on the io9 piece about the lack of women at the CCW and the work of obvious trolls, but there is one more theme in the comments I would like to address. More than one commenter stated something like “if they overlooked people that were more qualified to be present then it absolutely needs to be addressed [emphasis mine].” The idea that women have to be better than men before their opinion should be taken into consideration is rather insidious. It can be linked to the so-called confidence gap between men and women, among other aspects of gender dynamics in the workplace. I see this idea even in my own life – just last week, I did extra reading prior to a meeting because I felt that, as a young woman, I needed to know the topic better than anyone else before they would take me seriously. One of the lessons I will take from this discussion of gender in global policy development, spawned by the lack of women at the CCW meeting, is that it is beyond time to ask: why should a woman have to be more qualified, rather than just as qualified, as a man to be considered an expert?
Last week’s CCW meeting made much progress in the global discussion of autonomous weapons systems despite the regressive gender dynamics but we cannot continue on that path without recognizing the capabilities and expertise offered by women. We cannot continue to miss half the conversation. Civil society is taking action to improve gender representation in policy making and the media has recognized women as experts on this topic on numerous occasions so now it is up to the states. It is time for states to get serious about implementing Resolution 1325. It is time for states to hear more than half the story.
Update May 23: the International Committee for Robot Arms Control has published a list of its world-leading female experts to prevent anyone from using the excuse that there are no suitable women experts.
The two days started with an op-ed by Ian Kerr who holds the Canada Research Chair in Ethics, Law and Technology at the University of Ottawa and is a member of ICRAC.
On April 28th, we met with other peace, disarmament and development organizations to talk about the campaign and to begin to build a stronger civil society presence in Canada on this issue. There was a lot of interest from our non-profit colleagues, so we look forward to hearing more voices on this issue in the near future.
Later that day, we hosted a public event at Ottawa City Hall. There was a panel discussion with Peter, Paul, Mary and Ian followed by a rather lively Question and Answer session with the audience. The audience was generally quite supportive of the Campaign and our efforts to achieve a pre-emptive ban on autonomous weapons. Audience members with backgrounds in engineering, law, the military and politics all expressed concern about the development of killer robots.
The following morning, MAC hosted a breakfast briefing for parliamentarians and their staff, other NGOs and decision makers in Ottawa. The Bagels and ‘Bots breakfast was the first time some of these decision makers had heard of the issue and it seemed to strike a chord with many in the room. After breakfast, the team was off to Parliament Hill for a press conference. At the press conference and in MAC’s press release, campaigners called for Canadian leadership on this issue internationally and for Canada to be the first country in the world to declare a moratorium on the development and production of killer robots.
The media in Ottawa and across the country have taken quite an interest in these events. The Canadian Press story was picked up in newspapers across the country as well as national media outlets, and there was an associated list of facts about killer robots. Sun News Network and the Ottawa Citizen also covered the Campaign, while MAC has received a number of radio interview requests. Paul Hannon, Executive Director, was on CKNW Morning News with Philip Till.
One very exciting result of these activities is that The Globe and Mail’s editorial team has come out in support of the Campaign to Stop Killer Robots and our call:
The world has long banned some weapons deemed dangerous, indiscriminate or inhumane, including chemical weapons and land mines. Autonomous robot weapons carry all such risks, and add new ones to the list. They are not wielded remotely by humans, but are intended to operate without supervision. They’re about turning life and death decisions over to software. Canada should be a leading voice advocating for a global protocol limiting their development and use.
Jian Ghomeshi also called for Canadian leadership on killer robots on CBC Radio’s Q; he said that leadership on this issue is something Canadians could be proud of and that it could be a legacy issue for Prime Minister Stephen Harper.
The Keep Killer Robots Fiction initiative is off to a great start. You can get involved by signing and sharing the petition at: /KRpetition.
In November 2013, the World Council of Churches made a statement recommending that governments: “Declare their support for a pre-emptive ban on drones and other robotic weapons systems that will select and strike targets without human intervention when operating in fully autonomous mode;”.
Building on that recommendation, our colleagues in the Netherlands have launched an Interfaith Declaration that says:
we, as religious leaders, faith groups and faith-based organizations, raise our collective voice to call on all governments to participate in the international debate on the issue, and to work towards a ban on the development, production and use of fully autonomous weapons.
The team at PAX put together a Factsheet on the Interfaith Declaration and you can find even more information on their website.
We’re calling on all Canadian religious leaders, faith-based organizations and faith groups to support a ban on autonomous weapons and to sign the Interfaith Declaration. Here is the full text of the Declaration: Interfaith Declaration.pdf (EN) and Interfaith Declaration FR.pdf (FR). To sign the declaration digitally, visit /stay-informed/news/interfaith-declaration or contact PAX directly at [email protected]. In addition to the Interfaith Declaration for religious leaders and faith groups, individuals can sign Mines Action Canada’s Keep Killer Robots Fiction petition.
The Campaign to Stop Killer Robots was launched in April 2013 in London. Mines Action Canada is a co-founder of the campaign and a member of its Steering Committee along with other disarmament, human rights and humanitarian organizations.
In May, the first Human Rights Council debate on lethal autonomous robotics followed the presentation of a report by the UN special rapporteur, Christof Heyns, on extra-judicial killings. During the debate, 20 governments made their views known for the first time.
A University of Massachusetts survey of 1,000 Americans found a majority oppose fully autonomous weapons and support actions to campaign against them. In August, the International Committee of the Red Cross issued a “new technologies” edition of its quarterly journal. The journal included articles by campaigners on fully autonomous weapons.
During the UN General Assembly First Committee on Disarmament and International Security in New York in October, 16 governments made statements on killer robots. Also in October, campaign member the International Committee for Robot Arms Control launched a letter from over 250 roboticists, scientists and other experts calling for a ban on autonomous weapons.
In November at the Convention on Conventional Weapons (CCW) in Geneva, 35 nations expressed their views on lethal autonomous weapons systems. States parties to the Convention on Conventional Weapons agreed to a mandate to begin work in 2014 on the emerging technology of “lethal autonomous weapons systems.”
Mines Action Canada (MAC) welcomed this historic decision to begin to address this issue. MAC encouraged all states to pursue an international ban on these weapons to ensure there will always be meaningful human control over targeting decisions and the use of violent force. We were also pleased that Canada made its first public statements on this topic during the CCW, joining the other 43 nations who have spoken out on fully autonomous weapons since May. “If we have learned anything from the Canadian-led efforts to ban landmines, it is that the world cannot afford to wait until there is a humanitarian crisis to act. We need a pre-emptive ban on fully autonomous weapons before they can cause a humanitarian disaster,” said Paul Hannon, Executive Director, Mines Action Canada in a press release.
Our colleagues around the world have also seen exciting developments in their countries. The international campaign has put together a global recap.
Canada does not have a national policy on autonomous weapons. There are many reasons why Canada needs to have a policy on killer robots as soon as possible. This year, MAC looks forward to working with the Government of Canada to develop a national policy and to work towards an international treaty banning killer robots.
You can take action in 2014 by signing our Keep Killer Robots Fiction petition, by sharing the campaign website www.stopkillerrobots.ca and by donating to this new campaign.
There have been some exciting and important developments over the summer. The International Committee of the Red Cross (ICRC) launched the newest edition of the International Review of the Red Cross and the theme is New Technologies and Warfare. A number of campaigners contributed to the journal so it is definitely worth a read. The ICRC also published a Frequently Asked Questions document on autonomous weapons that helps explain the issue and the ICRC’s position on fully autonomous weapons.
France along with the United Nations Office for Disarmament Affairs in Geneva convened a seminar on fully autonomous weapons for governments and civil society in early September. The Campaign to Stop Killer Robots had campaigners taking part and you can read the full report on the global campaign’s website.
The campaigns in Germany and Norway are starting off strong as well. In the lead-up to the German election, all the major parties shared their policy positions on fully autonomous weapons with our colleagues at Facing Finance. Norwegian campaigners launched their campaign with a breakfast seminar and are now waiting to hear what the new Norwegian government’s policy on fully autonomous weapons will be.
Like our colleagues in Norway, we’re still waiting to hear what Canada’s policy on fully autonomous weapons will be. We have written to the Ministers of National Defence and of Foreign Affairs, but the campaign team has not yet heard back. In the meantime, Canadians can weigh in on the topic through our new online petition. Share and sign the petition today! This petition is the first part of a new initiative that will be coming your way in a few weeks. Keep your eye out for the news and until then keep sharing the petition so that the government knows that Canadians have concerns about fully autonomous weapons and believe that Canada should have a strong policy against them.
EDIT: We had a very human moment here and forgot to include congratulations to James Foy of Vancouver for winning the 2013 Canadian Bar Association’s National Military Law Section Law School Sword and Scale Essay Prize for his essay called Autonomous Weapons Systems: Taking the Human out of International Humanitarian Law. It is great to see law students looking at this new topic and also wonderful that the Canadian Bar Association recognized the importance of this issue. Congratulations James!
Our colleagues at Article 36 have done a detailed analysis of the debate. In light of the stronger language in this debate, there is some room to be optimistic:
It would seem straightforward to move from such a strong national position to a formalised national moratorium and a leading role within an international process to prohibit such weapons. The government did not provide any reason as to why a moratorium would be inappropriate, other than to speculate on the level of support amongst other countries for such a course of action.
Whilst significant issues still require more detailed elaboration, Article 36 believes this parliamentary debate has been very valuable in prompting reflection and Ministerial scrutiny of UK policy on fully autonomous weapons and narrowing down the areas on which further discussions should focus. It appears clear now that there will be scope for such discussions to take place with the UK and other states in the near future.
The UK parliamentary debate and Article 36’s analysis of it, coming so soon after the Human Rights Council debate and the widespread media coverage of the issue, make it quite clear that it is time to have a similarly substantive and non-partisan debate in the Canadian House of Commons as the government works out its policy on this important issue.
Now unless, like me, you grew up with a sci-fi geek for a father who introduced you to various fictional worlds like those in Star Wars, Star Trek and 2001: A Space Odyssey at a young age, you might not know who Isaac Asimov is, what his Three Laws of Robotics are and why these laws are relevant to the Campaign to Stop Killer Robots.
Isaac Asimov (1920-1992) was an American scientist and writer, best known for his science fiction, especially his short stories. In his writings, Asimov created the Three Laws of Robotics, which govern the actions of his robot characters. In his stories, the Three Laws were programmed into robots as a safety function. The laws were first stated in the short story “Runaround” but you can see them in many of his other writings, and since then they have shown up in other authors’ work as well.
The Three Laws of Robotics are:
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
After reading the Three Laws, it might be pretty clear why Mr. Asimov’s ideas are frequently mentioned in media coverage of our campaign to stop fully autonomous weapons. A fully autonomous weapon will most definitely violate the first and second laws of robotics.
To me, the Three Laws seem to be pretty common sense guides for the actions of autonomous robots. It is probably a good idea to protect yourself from being killed by your own machine – OK, not probably – it is a good idea to make sure your machine does not kill you! It is also important for us to remember that Asimov recognized that even regular robots with artificial intelligence (not fully autonomous weapons) could pose a threat to humanity at large, so he later added a fourth law, the Zeroth Law, to come before the others:
0. A robot may not harm humanity, or, by inaction, allow humanity to come to harm.
“But Erin,” you say, “these are just fictional stories; the Campaign to Stop Killer Robots is dealing with how things really will be. We need to focus on reality not fiction!” I hear you but since fully autonomous weapons do not yet exist we need to take what we know about robotics, warfare and law and add a little imagination to foresee some of the possible problems with fully autonomous weapons. Who better to help us consider the possibilities than science fiction writers who have been thinking about these types of issues for decades?
At the moment, Asimov’s Three Laws are the closest thing we have to laws explicitly governing the use of fully autonomous weapons. Asimov’s stories often tell of how the application of these laws results in robots acting in weird and dangerous ways the programmers did not predict. By articulating some pretty common sense laws for robots and then showing how those laws can have unintended negative consequences when implemented by artificial intelligence, Asimov’s writings may have made the first argument that a set of parameters to guide the actions of fully autonomous weapons will not be sufficient. Even if you did not have a geeky childhood like I did, you can still see the problems with creating fully autonomous weapons. You don’t have to read Asimov, know who HAL is or have a disliking for the Borg to worry that we won’t be able to control how artificial intelligence will interpret our commands, and anyone who has tried to use a computer, a printer or a cell phone knows that there is no end to the number of ways technology can go wrong. We need a pre-emptive ban on fully autonomous weapons before it is too late, and that is what the Campaign to Stop Killer Robots will be telling the diplomats at the UN in Geneva at the end of the month.
- Erin Hunt, Program Officer
All the discussions we’ve been having since the launch of the Campaign to Stop Killer Robots make me think about Alice in Wonderland and therefore I’ve been thinking a lot about rabbit holes. I feel like current technology has us poised at the edge of a rabbit hole and if we take that extra step and create fully autonomous weapons we are going to fall – down that rabbit hole into the unknown, down into a future where a machine could make the decision to kill you, down into a situation that science fiction books have been warning us about for decades.
The best way to prevent such a horrific fall is going to be to create laws and policies that will block off the entrance to the rabbit hole so to speak. At the moment, not many countries have policies to temporarily block the entrance and no one has laws to ban killer robots and close off the rabbit hole permanently. It is really only the US and the UK who have even put up warning signs and a little bit of chicken wire around the entrance to this rabbit hole of killer robots through recently released policies and statements.
Over the past few weeks our colleagues at Human Rights Watch (HRW) and Article 36 have released reports on the US and UK policies towards fully autonomous weapons (killer robots). HRW analyzed the 2012 US policy on autonomous weapons found in Department of Defense Directive Number 3000.09. You can find the full review online. Article 36 has a lot to say about the UK policy in their paper available online as well.
So naturally after reading these papers, I went in search of Canada’s policy. That search left me feeling a little like Alice lost in Wonderland just trying to keep my head or at least my sanity in the face of a policy that like the Cheshire Cat might not be all there.
After my futile search, it became even more important that we talk to the government to find out if Canada has a policy on fully autonomous weapons. Until those conversations happen, let’s see what we can learn from the US and UK policies and the analysis done by HRW and Article 36.
The US Policy
I like that the US Directive notes the risks to civilians including “unintended engagements” and failure. One key point that Human Rights Watch’s analysis highlights is that the Directive states that for up to 10 years the US Department of Defense can only develop and use fully autonomous weapons that have non-lethal force. The moratorium on lethal fully autonomous weapons is a good start but there are also some serious concerns about the inclusion of waivers that could override the moratorium. HRW believes that “[t]hese loopholes open the door to the development and use of fully autonomous weapons that could apply lethal force and thus have the potential to endanger civilians in armed conflict.”[1]
In summary Human Rights Watch believes that:
The Department of Defense Directive on autonomy in weapon systems has several positive elements that could have humanitarian benefits. It establishes that fully autonomous weapons are an important and pressing issue deserving of serious concern by the United States as well as other nations. It makes clear that fully autonomous weapons could pose grave dangers and are in need of restrictions or prohibitions. It is only valid for a limited time period of five to ten years, however, and contains a number of provisions that could weaken its intended effect considerably. The Directive’s restrictions regarding development and use can be waived under certain circumstances. In addition, the Directive highlights the challenges of designing adequate testing and technology, is subject to certain ambiguity, opens the door to proliferation, and applies only to the Department of Defense.[2]
In terms of what this all means for us in Canada, we can see there may be some aspects of the American policy that are worth adopting. The restrictions on the use of lethal force by fully autonomous weapons should be adopted by Canada to protect civilians from harm without the limited time period and waivers. I believe that Canadians would want to ensure that humans always make the final decision about who lives and who dies in combat.
The UK Policy
Now our friends at Article 36 have pointed out that the UK situation is a little more convoluted – and they are not quite ready to call it a comprehensive policy, but since “the UK assortment of policy-type statements” sounds ridiculous, for the purposes of this post I’m shortening it to the UK almost-policy, with the hope that one day it will morph into a full policy. Unlike the US policy, which is found in a neat little directive, the UK almost-policy is cobbled together from some statements and a note from the Ministry of Defence. You can have a closer look at the Article 36 analysis of the almost-policy.
To sum up, Article 36 outlines three main shortcomings of the UK almost-policy:
One of the most interesting points that Article 36 makes is the need for a definition of what human control over weapons systems means. If you are like me, you probably think that humans would get to make the decision to fire on a target, making the final decision of who lives or who dies, but we need to know exactly what governments mean when they say that humans will always be in control. The Campaign to Stop Killer Robots wants to ensure that there is always meaningful human control over lethal weapons systems.
Defining what we mean by meaningful human control is going to be a very large discussion that we want to have with governments, with civil society, with the military, with roboticists and with everyone else. This discussion will raise some very interesting moral and ethical questions, especially since a two-star American general recently said that he thought it was “the ultimate human indignity to have a machine decide to kill you.” The problem is that once the technology exists, it is going to be incredibly difficult to know where it will go and how on earth we are going to get back up that rabbit hole. For us as Canadians, it is key to start having that conversation as soon as possible so we don’t end up stumbling down the rabbit hole of fully autonomous weapons by accident.
- Erin Hunt, Program Officer