As a co-founder and the Canadian representative of the Campaign to Stop Killer Robots, Mines Action Canada welcomes Clearpath Robotics’ decision and applauds their staff for their thoughtful and courageous stance on this issue. “Clearpath Robotics has set the ethical standard for robotics companies around the world. Their pledge to not manufacture autonomous weapons systems demonstrates clearly that research and development into autonomous robots and military robots does not require the creation of ‘killer robots’ and that there are many applications of autonomous robotics that can benefit humanity,” said Paul Hannon, Executive Director, Mines Action Canada. “As a Canadian, I am proud that a Canadian company was the first in the world to pledge to not manufacture killer robots.”
With the international community scheduled to discuss autonomous weapons systems at the United Nations again this fall, Mines Action Canada strongly supports Clearpath Robotics’ pledge, and we join them in encouraging “those who might see business opportunities in this technology to seek other ways to apply their skills and resources for the betterment of humankind.” We look forward to similar statements from other robotics companies in Canada and around the world. Members of the public who share Clearpath Robotics’ views can sign the Keep Killer Robots Fiction petition at /KRpetition, while individual roboticists and scientists can join the International Committee for Robot Arms Control’s Scientists’ Call online at: http://icrac.net/call/.
Learn more about Clearpath Robotics on Twitter or Facebook.
There were 18 experts invited to give presentations to the delegates and all of them were men. Now that might sound like a story line from the final season of Mad Men, but sadly we are talking about a large diplomatic meeting hosted by the United Nations in 2014, not the exploits of Sterling, Cooper, Draper, Pryce in 1965. The Campaign to Stop Killer Robots highlighted that the provisional agenda was unbalanced and suggested numerous possible experts who are leaders in their fields and who are women. And yet the panels proceeded as planned, leaving women, as Matthew Bolton put it, “literally condemned to the margins — only allowed to speak in civil society statements from the back of the room or ‘Side Events’.”
In the opening debate, civil society representatives and Norway commented on the gender disparity and later Christof Heyns, UN Special Rapporteur on Extra-Judicial Killings, also commented on the lack of women presenting. Throughout the meeting, women contributed greatly to the discussion through side-events, statements and interventions when permitted by the meeting’s chair. Also, many of the memos and papers provided by civil society were written or co-authored by women.
Civil society, including the Campaign to Stop Killer Robots, has taken action to address this anachronistic situation. Sarah Knuckey began compiling a list of women working, writing and speaking on autonomous weapons – the list currently includes over 25 names and is growing. Article 36, a co-founder of the Campaign to Stop Killer Robots, is compiling a list of people working in the field of peace and security – particularly disarmament, arms control and the protection of civilians – who benefit from their male gender and have committed not to speak on panels that include only men. They say:
We believe that the practice of selecting only men to speak on panels in global policymaking forums is unjust. It excludes the voices of women and other gender identities from such events, running counter to UN Security Council Resolution 1325, which commits to inclusion of women in discussions on peace and security. Global policymaking efforts on peace and security – including disarmament, arms control and the protection of civilians – must include people of a diversity of gender identities.
Mines Action Canada supports this new effort and encourages others working in this field who identify as men to join the initiative. The gender disparity at the meeting was so glaring that Motherboard covered the issue and the story was picked up by io9. As someone with a passing interest in the construction of ideas and norms, I found the discussion surrounding this issue on io9 very interesting. I read the internet comments so you don’t have to, and there are a few aspects of that online conversation I would like to address.
First up is the frequent question: why does gender matter when discussing autonomous weapons? The fact that only men were invited to speak at the UN as experts on autonomous weapons, and gender considerations at the CCW more broadly, matter for a number of reasons. I feel ridiculous listing reasons why women should be included in global policy-making forums since it is (as stated above) 2014, not 1965, but for brevity’s sake here are a couple of reasons unique to the autonomous weapons discussion:
Another common line of comment on this story was the idea that the organizers got the best experts to present on these topics and that, unfortunately, when it comes to things like science and engineering, most of the experts are men. Since this is not the place to discuss why there are more men than women in STEM fields, I’ll move on to the assertion that they got the best experts to present. I don’t have to say much because Sarah Knuckey’s list has made it quite clear that there are a number of women who are at the top of their fields and “experts” on the subject matter discussed last week. But it is worth highlighting that the Harvard-based legal scholar who wrote the first report on the legal arguments surrounding autonomous weapons, the report that launched the global discussion (and who is a woman), was not included in either panel discussing legal issues. Another troubling part of this idea is the notion that decisions over autonomy and human control in conflict should be handled only by experts in technical fields like computer science. The potential impact of autonomous weapons necessitates in-depth technical, legal, ethical and moral analysis. A perceived gender imbalance in STEM does not justify only hearing from men on all topics of discussion.
I have ignored many of the blatantly misogynistic comments on the io9 piece about the lack of women at the CCW and the work of obvious trolls, but there is one more theme in the comments I would like to address. More than one commenter stated something like “if they overlooked people that were more qualified to be present then it absolutely needs to be addressed [emphasis mine].” The idea that women have to be better than men before their opinion should be taken into consideration is rather insidious. It can be linked to the so-called confidence gap between men and women, among other aspects of gender dynamics in the workplace. I see this idea even in my own life – just last week, I did extra reading prior to a meeting because I felt that, as a young woman, I needed to know the topic better than anyone else before they would take me seriously. One of the lessons I will take from this discussion of gender in global policy development, spawned by the lack of women at the CCW meeting, is that it is beyond time to ask why a woman should have to be more qualified, rather than just as qualified, as a man to be considered an expert.
Last week’s CCW meeting made much progress in the global discussion of autonomous weapons systems despite the regressive gender dynamics, but we cannot continue on that path without recognizing the capabilities and expertise offered by women. We cannot continue to miss half the conversation. Civil society is taking action to improve gender representation in policy making, and the media has recognized women as experts on this topic on numerous occasions, so now it is up to the states. It is time for states to get serious about implementing Resolution 1325. It is time for states to hear more than half the story.
Update May 23: the International Committee for Robot Arms Control has published a list of its world-leading female experts to prevent anyone from using the excuse that there are no suitable women experts.
Last week, 87 states gathered in Geneva to discuss lethal autonomous weapons systems.
This Informal Experts Meeting ran from May 13 to May 16 and was the first international discussion on autonomous weapons systems. The meeting was focused on information rather than decision making. The 87 states attended the meeting under the Convention on Conventional Weapons (CCW) along with representatives from UN agencies including UNIDIR, the International Committee of the Red Cross (ICRC), and registered non-governmental organizations including the delegation of the Campaign to Stop Killer Robots.
The four-day meeting included a general debate followed by substantive sessions with presentations from experts. The Chair’s summary showed that there is a willingness to pursue this topic and that a possible issue for the next meetings would be the concept of meaningful human control. The options cited for going forward include exchange of information, development of best practices, a moratorium on research, and a ban. The Campaign to Stop Killer Robots has a great piece about the meeting on their website.
Over the course of the week many states highlighted the importance of always maintaining meaningful human control over targeting and attack decisions. We at MAC were pleased not only that 5 countries have already called for a ban, but also that no country vigorously defended or argued for autonomous weapons systems, although the Czech Republic and Israel each spoke on the desirability of such systems.
Unlike most countries, Canada has not yet provided copies of its statements to Reaching Critical Will or to the United Nations, so we have had to piece together the statements from the CCW Review and Twitter. On day 1, Canada was the only country to say that existing international humanitarian law is sufficient to regulate the use of autonomous weapons. It also said that defining autonomy is difficult because autonomy is subjective, depending on the system. On day 2, Canada said that the moral aspects of autonomous weapons are important and must be part of discussions in the CCW. It appears Canada did not make any statements or interventions on day 3. On day 4, Canada called for more discussion of the ethical and political issues, including meaningful human control, under the CCW. Canada also said humanitarian and state security concerns must be balanced in considering autonomous weapons – language usually heard from Russia, China and similar states.
Some of the presentations from the substantive sessions are available online:
Technological Issues – key topics included definitions of autonomy and meaningful human control. This session included a debate between Ron Arkin, who believes it is premature to ban autonomous weapons, and Noel Sharkey, who does not believe that computerized weapons without a human in control can fully comply with international humanitarian law in the foreseeable future.
Ethics and Sociology – key topics included whether machines should make the decision to take a human life, the relevance of human judgement to international law, and the need for human control.
Legal Issues (International Humanitarian Law) – key topics included definitions, whether or not autonomous weapons systems are inherently illegal, morality and military effectiveness. This was an extensive debate.
Legal Issues (other areas of international law) – key topics included human rights law, accountability and article 36 weapons reviews.
Operational and military issues – key topics included meaningful human control, military effectiveness and the nature of warfare.
The Campaign to Stop Killer Robots held side events each day to delve deeper into the issues at hand. These side events were well attended, and lively discussions covered the topics in greater depth.
While the meetings were progressing in Geneva, here at the national level Mines Action Canada was working to ensure these historic sessions received media coverage across Canada. For example:
CCW member states will reconvene in November to decide if they want to continue these talks. Until then Mines Action Canada and our colleagues in the international campaign will continue to push for a renewed and expanded mandate including continued discussions on meaningful human control over all targeting and firing decisions.
The memo shares lessons learnt from the process that resulted in Protocol IV on Blinding Laser Weapons which was a pre-emptive ban on a weapon due to humanitarian concerns. Protocol IV shows that pre-emptive bans (like the one called for by the Campaign to Stop Killer Robots) are possible under the Convention on Conventional Weapons. Download the Protocol IV Memo now.
The two days started with an op-ed by Ian Kerr who holds the Canada Research Chair in Ethics, Law and Technology at the University of Ottawa and is a member of ICRAC.
On April 28th, we met with other peace, disarmament and development organizations to talk about the campaign and to begin to build a stronger civil society presence in Canada on this issue. There was a lot of interest from our non-profit colleagues, so we look forward to hearing more voices on this issue in the near future.
Later that day, we hosted a public event at Ottawa City Hall. There was a panel discussion with Peter, Paul, Mary and Ian followed by a rather lively Question and Answer session with the audience. The audience was generally quite supportive of the Campaign and our efforts to achieve a pre-emptive ban on autonomous weapons. Audience members with backgrounds in engineering, law, the military and politics all expressed concern about the development of killer robots.
The following morning, MAC hosted a breakfast briefing for parliamentarians and their staff, other NGOs and decision makers in Ottawa. The Bagels and ‘Bots breakfast was the first time some of these decision makers had heard of the issue and it seemed to strike a chord with many in the room. After breakfast, the team was off to Parliament Hill for a press conference. At the press conference and in MAC’s press release, campaigners called for Canadian leadership on this issue internationally and for Canada to be the first country in the world to declare a moratorium on the development and production of killer robots.
The media in Ottawa and across the country have taken quite an interest in these events. The Canadian Press story was picked up by newspapers across the country as well as by national media outlets, and there was an associated list of facts about killer robots. The Sun News Network and the Ottawa Citizen also covered the Campaign, while MAC has received a number of radio interview requests. Paul Hannon, Executive Director, was on CKNW Morning News with Philip Till.
One very exciting result of these activities is that The Globe and Mail’s editorial team has come out in support of the Campaign to Stop Killer Robots and our call:
The world has long banned some weapons deemed dangerous, indiscriminate or inhumane, including chemical weapons and land mines. Autonomous robot weapons carry all such risks, and add new ones to the list. They are not wielded remotely by humans, but are intended to operate without supervision. They’re about turning life and death decisions over to software. Canada should be a leading voice advocating for a global protocol limiting their development and use.
Jian Ghomeshi, on CBC Radio’s Q, also called for Canadian leadership on killer robots, saying that leadership on this issue is something Canadians could be proud of and that it could be a legacy issue for Prime Minister Stephen Harper.
The Keep Killer Robots Fiction initiative is off to a great start. You can get involved by signing and sharing the petition at: /KRpetition.
When I first applied for an internship position to work on the Campaign to Stop Killer Robots back in November, I knew virtually nothing about either the campaign or the killer robots issue. I chose the internship with Mines Action Canada as my top choice because it was the position most closely related to my field of study: Conflict Analysis and Conflict Resolution. When submitting my application, I had a conversation with my fellow students about what exactly killer robots were. The general consensus of the group was that killer robots had to be the drones being used militarily in countries such as Pakistan and Yemen.
Since joining the International Campaign to Stop Killer Robots in January, I have had the privilege of being exposed to a new issue that has not been widely discussed by the general public or even most international affairs students. I learned about current efforts by militaries to develop robotic weapons which would have complete autonomy to choose whether or not to fire on a specified target, without meaningful human control. Most disturbingly, I learned that some countries (the United States, Israel, and several others) have not only taken steps to develop “human-out-of-the-loop weapons”, but that some current technologies could easily be adapted to become autonomous weapons. As a student in an international affairs program and as a concerned person, I find that this issue raises serious human rights and humanitarian concerns.
The use of autonomous weapons is a troubling issue for human rights advocates and humanitarian organizations because it would make humans increasingly vulnerable in a form of warfare that international law is not designed to accommodate. First, how could the protection of civilians be guaranteed in times of combat? If human judgment is taken off the battlefield, robots would be tasked with distinguishing armed combatants from ordinary citizens. In this scenario, would a robot have the capability to differentiate between a soldier holding a weapon and a child holding a toy gun? Such mistakes become more likely as robots are given greater autonomy and decision-making capability on the battlefield. Further, the development and use of autonomous weapons could pose serious issues of accountability in war. For example, if a robotic system were to go awry and end up massacring a village of non-combatants, who would be held accountable? Would it be the machine’s operator, the military, the computer programmer, or the manufacturer? Without military troops on land, in the air, or at sea, who can be held liable for the actions of robots in combat? Implementing the use of autonomous robots in war would severely reduce the legal protections civilians are accorded during conflict.
I am very concerned that putting autonomous weapons on the battlefield would change how wars are fought and conducted. Wars would no longer be fought by the military personnel of two opposing sides, but by autonomous weapons, capable of making their own ‘kill decision’, against human forces. Countries that have the financial means to develop autonomous weapons could threaten less developed countries, which would bear the cost of higher human casualties on the battlefield. More importantly, the potential for future conflict would grow, as the decision to enter combat would be much easier for leaders to make if they did not have to bear the cost of human casualties. The concern here is that countries would be sending machines to fight against humans, instead of the traditional model of human versus human. As difficult as this may be to hear, it is only through the casualties of soldiers on the battlefield that we are able to see the true cost of warfare. Taking human sacrifice out of the battlefield could cause an increase in future warfare.
As interest in the topic of killer robots grows in the international community, it is important that students, and indeed all citizens, begin to discuss the development of autonomous robots for military use in their respective fields. Should silence continue not only in the academic community but also in the Canadian parliament and the public domain, the potential for autonomous robots to make life and death decisions on the battlefield without human control may be realized. As one concerned student and citizen who has signed the petition to Keep Killer Robots Fiction, I strongly encourage everyone to help keep killer robots fiction, not only by increasing their knowledge of the subject but also by joining me in signing the petition at /KRpetition. Only through increased discussion and knowledge of this topic in the general community can pressure be mounted on governments to create a pre-emptive ban on this emerging threat.
Brett MacFarlane interned at Mines Action Canada and is a Master of Arts candidate at the Norman Paterson School of International Affairs at Carleton University, specializing in Conflict Analysis and Conflict Resolution.
Come join us to launch the Campaign to Stop Killer Robots in Canada.
Join us at Ottawa City Hall for a panel discussion on fully autonomous weapons, led by Mines Action Canada and including guest speakers:
Where?: Ottawa City Hall – the Colonel By Room
When?: April 28th, 7:00 pm (doors at 6:45 pm)
Check out the Public Event Flyer for all the details.
There is nothing Canadian about machines that kill people without human control. Machines that have no conscience. Machines that have no compassion. Machines without the ability to distinguish between someone who is a genuine threat and someone in the wrong place at the wrong time.
We, as a people, have for many years sought to build a safer and more peaceful world. Former Prime Minister Brian Mulroney made Nelson Mandela and the end of apartheid in South Africa “the highest priority of the government of Canada in our foreign affairs.” Former Prime Minister Lester Pearson brought about modern peacekeeping in 1956. Former Foreign Affairs Minister Lloyd Axworthy gathered states in our nation’s capital to end the use of anti-personnel landmines around the world. These men understood that a desire for peace and justice is a basic Canadian value. That is not something a machine can ever understand.
This issue presents us as Canadians with an opportunity to share our values, and our vision for a safer world. Killer Robots are perhaps the most important international arms control issue to emerge since nuclear weapons were dropped on Hiroshima and Nagasaki. Nuclear weapons redefined how we understood and approached warfare. That is why it is so absolutely necessary for the world to confront the problem of killer robots before and not after they see action on the battlefield.
The costs of playing catch-up are far too evident. Once weapons are employed, most countries will scramble to re-adjust to the change in the balance of power. During World War I, chemical weapons were used against Canadian soldiers, causing blindness, death and unspeakable suffering. Nearly one hundred years later, chemical weapons were used in Syria, causing death and significant harm to civilians. With thousands of casualties of chemical weapons in between, the difficulty of banning weapons once they have been put into use is quite evident.
History has shown that the support and leadership of our nation can bring about international change. We have a duty as moral entrepreneurs to prevent the horror of autonomous killing machines from ever becoming a reality.
In November 2013, states agreed to discuss the question of lethal autonomous robots at meetings of the Convention on Conventional Weapons in May 2014. This umbrella agreement allows its 117 member states to consider issues of arms control.
But at the moment, the official Canadian government position on killer robots is unclear. A government statement in the February 2014 edition of L’actualité offers little insight. In the article, a Canadian Foreign Affairs spokesman indicated that Canada does not ban weapons that do not yet exist. But in fact, Canada has participated in a pre-emptive ban of weapons before.
In 1995, Canada was one of the original parties to Protocol IV of the Convention on Conventional Weapons. This international agreement banning blinding lasers was made in the very same forum in which killer robots are set to be discussed in May. This not only represents a step in the right direction but a precedent upon which to build.
If a pre-emptive ban has been done before, it can be done again. Whether a weapon exists yet or not should have no bearing on whether the technology should be illegal under international humanitarian law. What should matter is whether we as a people believe that these weapons can ever be considered to be humane. To me, and to many others, the answer to that question is clearly no.
If you feel that as Canadians we must take a stand, please join me in signing our petition to Keep Killer Robots Fiction.
Matthew Taylor is an intern at Mines Action Canada and a Master of Arts candidate at the Norman Paterson School of International Affairs at Carleton University, specializing in Intelligence and National Security.
In November 2013, the World Council of Churches made a statement recommending that governments: “Declare their support for a pre-emptive ban on drones and other robotic weapons systems that will select and strike targets without human intervention when operating in fully autonomous mode”.
Building on that recommendation, our colleagues in the Netherlands have launched an Interfaith Declaration that says:
we, as religious leaders, faith groups and faith-based organizations, raise our collective voice to call on all governments to participate in the international debate on the issue, and to work towards a ban on the development, production and use of fully autonomous weapons.
The team at PAX put together a Factsheet on the Interfaith Declaration and you can find even more information on their website.
We’re calling on all Canadian religious leaders, faith-based organizations and faith groups to support a ban on autonomous weapons and to sign the Interfaith Declaration. Here is the full text of the Declaration: Interfaith Declaration.pdf (EN) and Interfaith Declaration FR.pdf (FR). To sign the declaration digitally, visit /stay-informed/news/interfaith-declaration or contact PAX directly at [email protected]. In addition to the Interfaith Declaration for religious leaders and faith groups, individuals can sign Mines Action Canada’s Keep Killer Robots Fiction petition.
Check it out and share your thoughts in the comments.
Now the team at PAX wasn’t content just to post an amazing video; they also released a new report today. In Deadly Decisions: 8 objections to killer robots, the team opens with a disconcerting quote from John Pike:
First, you had human beings without machines.
Then you had human beings with machines.
And finally you have machines without human beings.
The report then outlines eight key objections to the development and use of killer robots. It is definitely worth a read: http://www.paxvoorvrede.nl/media/files/deadlydecisionsweb.pdf.
Great work PAX!