There were 18 experts invited to give presentations to the delegates and all of them were men. Now that might sound like a story line from the final season of Mad Men, but sadly we are talking about a large diplomatic meeting hosted by the United Nations in 2014, not the exploits of Sterling, Cooper, Draper, Pryce in 1965. The Campaign to Stop Killer Robots highlighted that the provisional agenda was unbalanced and suggested numerous possible experts who are leaders in their fields and who are women. And yet the panels proceeded as planned, leaving women, as Matthew Bolton put it, “literally condemned to the margins — only allowed to speak in civil society statements from the back of the room or ‘Side Events’.”
In the opening debate, civil society representatives and Norway commented on the gender disparity and later Christof Heyns, UN Special Rapporteur on Extra-Judicial Killings, also commented on the lack of women presenting. Throughout the meeting, women contributed greatly to the discussion through side-events, statements and interventions when permitted by the meeting’s chair. Also, many of the memos and papers provided by civil society were written or co-authored by women.
Civil society, including the Campaign to Stop Killer Robots, has taken action to address this anachronistic situation. Sarah Knuckey began compiling a list of women working, writing and speaking on autonomous weapons – the list currently includes more than 25 names and is growing. Article 36, a co-founder of the Campaign to Stop Killer Robots, is compiling a list of people working in the field of peace and security – particularly disarmament, arms control and the protection of civilians – who benefit from their male gender and who have committed not to speak on panels that include only men. They say:
We believe that the practice of selecting only men to speak on panels in global policymaking forums is unjust. It excludes the voices of women and other gender identities from such events, running counter to UN Security Council Resolution 1325, which commits to inclusion of women in discussions on peace and security. Global policymaking efforts on peace and security – including disarmament, arms control and the protection of civilians – must include people of a diversity of gender identities.
Mines Action Canada supports this new effort and encourages others working in this field who identify as men to join the initiative. The gender disparity at the meeting was so glaring that Motherboard covered the issue and the story was picked up by io9. As someone with a passing interest in the construction of ideas and norms, I found the discussion surrounding this issue on io9 very interesting. I read the internet comments so you don’t have to, and there are a few aspects of that online conversation I would like to address.
First up is the frequent comment – why does gender matter when discussing autonomous weapons? The fact that only men were invited to speak at the UN as experts on autonomous weapons, and gender considerations at the CCW more broadly, matter for a number of reasons. I feel ridiculous listing reasons why women should be included in global policy-making forums since it is (as stated above) 2014, not 1965, but for brevity’s sake here are a couple of reasons unique to the autonomous weapons discussion:
Another common line of comment on this story was the idea that the organizers got the best experts to present on these topics and that, unfortunately, when it comes to fields like science and engineering, most of the experts are men. Since this is not the place to discuss why there are more men than women in STEM fields, I’ll move on to the assertion that they got the best experts to present. I don’t have to say much because Sarah Knuckey’s list has made it quite clear that a number of women are at the top of their fields and are “experts” on the subject matter discussed last week. But it is worth highlighting that the Harvard-based legal scholar who wrote the first report on the legal arguments surrounding autonomous weapons – the report that launched the global discussion – and who is a woman, was not included in either panel discussing legal issues. Another troubling part of this idea is the suggestion that decisions over autonomy and human control in conflict should be handled only by experts in technical fields like computer science. The potential impact of autonomous weapons necessitates in-depth technical, legal, ethical and moral analysis. A perceived gender imbalance in STEM does not justify hearing only from men on all topics of discussion.
I have ignored many of the blatantly misogynistic comments on the io9 piece about the lack of women at CCW and the work of obvious trolls, but there is one more theme in the comments I would like to address. More than one commenter stated something like “if they overlooked people that were more qualified to be present then it absolutely needs to be addressed [emphasis mine].” The idea that women have to be better than men before their opinion should be taken into consideration is rather insidious. It can be linked to the so-called confidence gap between men and women, among other aspects of gender dynamics in the workplace. I see this idea even in my own life – just last week, I did extra reading prior to a meeting because I felt that, as a young woman, I needed to know the topic better than anyone else before they would take me seriously. One of the lessons I will take from this discussion of gender in global policy development, spawned by the lack of women at the CCW meeting, is that it is beyond time to ask why a woman should have to be more qualified than a man, rather than just as qualified, to be considered an expert.
Last week’s CCW meeting made much progress in the global discussion of autonomous weapons systems despite the regressive gender dynamics but we cannot continue on that path without recognizing the capabilities and expertise offered by women. We cannot continue to miss half the conversation. Civil society is taking action to improve gender representation in policy making and the media has recognized women as experts on this topic on numerous occasions so now it is up to the states. It is time for states to get serious about implementing Resolution 1325. It is time for states to hear more than half the story.
Update May 23: the International Committee for Robot Arms Control has published a list of its world-leading female experts to prevent anyone from using the excuse that there are no suitable women experts.
Last week, 87 states gathered in Geneva to discuss lethal autonomous weapons systems.
This Informal Experts Meeting ran from May 13 to May 16 and was the first international discussion on autonomous weapons systems. The meeting was focused on information rather than decision making. The 87 states attended the meeting under the Convention on Conventional Weapons (CCW) along with representatives from UN agencies including UNIDIR, the International Committee of the Red Cross (ICRC), and registered non-governmental organizations including the delegation of the Campaign to Stop Killer Robots.
The four day meeting included general debate and then substantive sessions with presentations from experts. The Chair’s summary showed that there is a willingness to pursue this topic and a possible issue for the next meetings would be the concept of meaningful human control. The options for going forward cited include exchange of information, development of best practices, moratorium on research, and a ban. The Campaign to Stop Killer Robots has a great piece about the meeting on their website.
Over the course of the week many states highlighted the importance of always maintaining meaningful human control over targeting and attack decisions. We at MAC were not only pleased that 5 countries have already called for a ban, but also that no country vigorously defended or argued for autonomous weapons systems, although the Czech Republic and Israel each spoke on the desirability of such systems.
Unlike most countries, Canada has not yet provided copies of its statements to Reaching Critical Will or to the United Nations, so we have had to piece together the statements from the CCW Review and Twitter. On day 1, Canada was the only country to say that existing international humanitarian law is sufficient to regulate the use of autonomous weapons. It also said that defining autonomy is difficult because autonomy is subjective and depends on the system. On day 2, Canada said that the moral aspects of autonomous weapons are important and must be part of discussions in the CCW. It looks like Canada did not make any statements or interventions on day 3. On day 4, Canada called for more discussion of the ethical and political issues, including meaningful human control, under the CCW. Canada also said humanitarian and state security concerns must be balanced in considering autonomous weapons – language usually heard from Russia, China and similar states.
Some of the presentations from the substantive sessions are available online:
Technological Issues – key topics included definitions of autonomy and meaningful human control. The session included a debate between Ron Arkin, who believes that it is premature to ban autonomous weapons, and Noel Sharkey, who does not believe that computerised weapons without a human in control can fully comply with international humanitarian law in the foreseeable future.
Ethics and Sociology – key topics included whether machines should make the decision to take a human life, the relevance of human judgement to international law and the need for human control.
Legal Issues (International Humanitarian Law) – key topics included definitions, whether or not autonomous weapons systems are inherently illegal, morality and military effectiveness. This was an extensive debate.
Legal Issues (other areas of international law) – key topics included human rights law, accountability and article 36 weapons reviews.
Operational and military issues – key topics included meaningful human control, military effectiveness and the nature of warfare.
The Campaign to Stop Killer Robots held side events each day to delve deeper into the issues at hand. These side events were well attended, and the lively discussions explored the topics in greater depth.
While the meetings were progressing in Geneva, here at the national level Mines Action Canada was working to ensure these historic sessions received media coverage across Canada. For example:
CCW member states will reconvene in November to decide if they want to continue these talks. Until then Mines Action Canada and our colleagues in the international campaign will continue to push for a renewed and expanded mandate including continued discussions on meaningful human control over all targeting and firing decisions.
The two days started with an op-ed by Ian Kerr who holds the Canada Research Chair in Ethics, Law and Technology at the University of Ottawa and is a member of ICRAC.
On April 28th, we met with other peace, disarmament and development organizations to talk about the campaign and to begin to build a stronger civil society presence in Canada on this issue. There was a lot of interest from our non-profit colleagues, so we look forward to hearing more voices on this issue in the near future.
Later that day, we hosted a public event at Ottawa City Hall. There was a panel discussion with Peter, Paul, Mary and Ian followed by a rather lively Question and Answer session with the audience. The audience was generally quite supportive of the Campaign and our efforts to achieve a pre-emptive ban on autonomous weapons. Audience members with backgrounds in engineering, law, the military and politics all expressed concern about the development of killer robots.
The following morning, MAC hosted a breakfast briefing for parliamentarians and their staff, other NGOs and decision makers in Ottawa. The Bagels and ‘Bots breakfast was the first time some of these decision makers had heard of the issue and it seemed to strike a chord with many in the room. After breakfast, the team was off to Parliament Hill for a press conference. At the press conference and in MAC’s press release, campaigners called for Canadian leadership on this issue internationally and for Canada to be the first country in the world to declare a moratorium on the development and production of killer robots.
The media in Ottawa and across the country have taken quite an interest in these events. The Canadian Press story was picked up in newspapers across the country as well as by national media outlets, and there was an associated list of facts about killer robots. The Sun News Network and the Ottawa Citizen also covered the Campaign, while MAC has received a number of radio interview requests. Paul Hannon, Executive Director, was on CKNW Morning News with Philip Till.
One very exciting result of these activities is that The Globe and Mail’s editorial team has come out in support of the Campaign to Stop Killer Robots and our call:
The world has long banned some weapons deemed dangerous, indiscriminate or inhumane, including chemical weapons and land mines. Autonomous robot weapons carry all such risks, and add new ones to the list. They are not wielded remotely by humans, but are intended to operate without supervision. They’re about turning life and death decisions over to software. Canada should be a leading voice advocating for a global protocol limiting their development and use.
Also, Jian Ghomeshi on CBC Radio’s Q called for Canadian leadership on killer robots; he said that leadership on this issue is something Canadians could be proud of and that it could be a legacy issue for Prime Minister Stephen Harper.
The Keep Killer Robots Fiction initiative is off to a great start. You can get involved by signing and sharing the petition at: /KRpetition.
When I first applied for an internship position to work on the Campaign to Stop Killer Robots back in November, I knew virtually nothing about either the campaign or the killer robots issue. I chose the internship with Mines Action Canada as my top choice because it was the position most closely related to my field of study: Conflict Analysis and Conflict Resolution. When submitting my application, I had a conversation with my fellow students about what exactly killer robots were. The general consensus of the group was that killer robots had to be the drones being used militarily in countries such as Pakistan and Yemen.
Since joining the International Campaign to Stop Killer Robots in January, I have had the privilege of being exposed to a new issue that has not been discussed by the general public or even most international affairs students. I learned about current efforts by militaries to develop robotic weapons that would have complete autonomy to choose whether or not to fire on a specified target without meaningful human control. Most disturbingly, I learned that some countries (e.g. the United States, Israel, and several others) have not only taken steps to develop “human-out-of-the-loop weapons”, but that some current technologies could easily be adapted to become autonomous weapons. As a student in an international affairs program and as a concerned person, I find that this issue raises human rights and humanitarian concerns.
The use of autonomous weapons is a troubling issue for human rights advocates and humanitarian organizations because it would make humans increasingly vulnerable in a form of warfare that international law is not designed to accommodate. First, how could the protection of civilians be guaranteed in times of combat? If human judgment is taken out of the battlefield, robots would be tasked with distinguishing armed combatants from ordinary citizens. In this scenario, would a robot have the capability to differentiate between a soldier holding a weapon and a child holding a toy gun? Such mistakes become more likely as robots are given greater autonomy and decision-making capabilities on the battlefield. Further, the development and use of autonomous weapons could pose serious issues of accountability in war. For example, if a robotic system were to go awry and end up massacring a village of non-combatants, who would be held accountable? Would it be the machine’s operator, the military, the computer programmer, or the manufacturer of the machine? Without military troops in the air, on land, or at sea, who can be held liable for the actions of robots in combat? Implementing the use of autonomous robots in war would severely reduce the legal protections civilians are accorded during conflict.
I am very concerned that putting autonomous weapons on the battlefield would change how wars are fought and conducted. Wars would no longer be fought by the military personnel of two opposing sides, but by autonomous weapons, capable of making their own ‘kill decision’, against human forces. Countries that have the financial means to develop autonomous weapons could threaten less developed countries, which would bear the cost of higher human casualties on the battlefield. More importantly, the potential for future conflict would grow, as the decision to enter into combat would be much easier for leaders to make if they did not have to bear the costs of human casualties. The concern here is that countries would be sending machines to fight against humans, instead of the traditional model of human versus human. As difficult as this may be to hear, it is only through the casualties of soldiers on the battlefield that we are able to see the true cost of warfare. Taking human sacrifice out of the battlefield could potentially cause an increase in future warfare.
As interest in the topic of killer robots grows in the international community, it is imperative that students, and indeed all citizens, begin to discuss the development of autonomous robots for military use in their respective fields. Should silence continue not only in the academic community but also in the Canadian parliament and the public domain, the potential for autonomous robots to make life and death decisions on the battlefield without human control may be realized. As one concerned student and citizen who has signed the petition to Keep Killer Robots Fiction, I strongly encourage everyone to Keep Killer Robots Fiction, not only by gaining exposure to and increasing their knowledge of the subject, but also by joining me in signing the petition at /KRpetition. Only through increased discussion and knowledge of this topic in the general community can pressure be mounted on governments to create a pre-emptive ban on this emerging threat.
Brett MacFarlane interned at Mines Action Canada and is a Master of the Arts Candidate at the Norman Paterson School of International Affairs at Carleton University specializing in Conflict Analysis and Conflict Resolution.
There is nothing Canadian about machines that kill people without human control. Machines that have no conscience. Machines that have no compassion. Machines without the ability to distinguish between someone who is a genuine threat and someone in the wrong place at the wrong time.
We, as a people, have for many years sought to build a safer and more peaceful world. Former Prime Minister Brian Mulroney made Nelson Mandela and the end of apartheid in South Africa “the highest priority of the government of Canada in our foreign affairs.” Former Prime Minister Lester Pearson brought about modern peacekeeping in 1956. Former Foreign Affairs Minister Lloyd Axworthy gathered states in our nation’s capital to end the use of anti-personnel landmines around the world. These men understood that a desire for peace and justice is a basic Canadian value. That is not something a machine can ever understand.
This issue presents us as Canadians with an opportunity to share our values, and our vision for a safer world. Killer Robots are perhaps the most important international arms control issue to emerge since nuclear weapons were dropped on Hiroshima and Nagasaki. Nuclear weapons redefined how we understood and approached warfare. That is why it is so absolutely necessary for the world to confront the problem of killer robots before and not after they see action on the battlefield.
The costs of playing catch-up are far too evident. Once weapons are employed, most countries will scramble to re-adjust to the change in the balance of power. During World War I, chemical weapons were used against Canadian soldiers, causing blindness, death and unspeakable suffering. Nearly one hundred years later, chemical weapons were used in Syria, causing death and significant harm to civilians. With thousands of chemical weapons casualties in between, the difficulty of banning weapons once they have been put into use is quite evident.
History has shown that the support and leadership of our nation can bring about international change. We have a duty as moral entrepreneurs to prevent the horror of autonomous killing machines from ever becoming a reality.
In November 2013, states agreed to discuss the question of lethal autonomous robots at meetings of the Convention on Conventional Weapons in May 2014. This umbrella agreement allows its 117 member states to consider issues of arms control.
But at the moment, the official Canadian government position on killer robots is unclear. A government statement in the February 2014 edition of L’actualité offers little insight. In the article, a Canadian Foreign Affairs spokesman indicated that Canada does not ban weapons that do not yet exist. But in fact, Canada has participated in a pre-emptive ban of weapons before.
In 1995, Canada was one of the original parties to Protocol IV of the Convention on Conventional Weapons. This international agreement banning blinding lasers was made in the very same forum in which killer robots are set to be discussed in May. This not only represents a step in the right direction but a precedent upon which to build.
If a pre-emptive ban has been done before, it can be done again. Whether a weapon exists yet or not should have no bearing on whether the technology should be illegal under international humanitarian law. What should matter is whether we as a people believe that these weapons can ever be considered to be humane. To me, and to many others, the answer to that question is clearly no.
If you feel that as Canadians we must take a stand, please join me in signing our petition to Keep Killer Robots Fiction.
Matthew Taylor is an intern at Mines Action Canada and is a Master of the Arts Candidate at the Norman Paterson School of International Affairs at Carleton University specializing in Intelligence and National Security.
In November 2013, the World Council of Churches made a statement recommending that governments: “Declare their support for a pre-emptive ban on drones and other robotic weapons systems that will select and strike targets without human intervention when operating in fully autonomous mode;”.
Building on that recommendation, our colleagues in the Netherlands have launched an Interfaith Declaration that says:
we, as religious leaders, faith groups and faith-based organizations, raise our collective voice to
call on all governments to participate in the international debate on the issue, and to work
towards a ban on the development, production and use of fully autonomous weapons.
The team at PAX put together a Factsheet on the Interfaith Declaration and you can find even more information on their website.
We’re calling on all Canadian religious leaders, faith-based organizations and faith groups to support a ban on autonomous weapons and to sign the Interfaith Declaration. Here is the full text of the Declaration: Interfaith Declaration.pdf (EN) and Interfaith Declaration FR.pdf (FR). To sign the declaration digitally, visit /stay-informed/news/interfaith-declaration or contact PAX directly at [email protected]. In addition to the Interfaith Declaration for religious leaders and faith groups, individuals can sign Mines Action Canada’s Keep Killer Robots Fiction petition.
Check it out and share your thoughts in the comments.
Now the team at PAX wasn’t content just to post an amazing video, they also released a new report today. In Deadly Decisions: 8 objections to killer robots, the team opens with a disconcerting quote from John Pike:
First, you had human beings without machines.
Then you had human beings with machines.
And finally you have machines without human beings.
After that the report outlines eight key objections to the development and use of killer robots. It is definitely worth a read: http://www.paxvoorvrede.nl/media/files/deadlydecisionsweb.pdf.
Great work PAX!
The Campaign to Stop Killer Robots was launched in April 2013 in London. Mines Action Canada is a co-founder of the campaign and a member of its Steering Committee along with other disarmament, human rights and humanitarian organizations.
In May, the first Human Rights Council debate on lethal autonomous robotics followed the presentation of a report by the UN Special Rapporteur on extra-judicial killings, Christof Heyns. During the debate 20 governments made their views known for the first time.
A University of Massachusetts survey of 1,000 Americans found a majority oppose fully autonomous weapons and support actions to campaign against them. In August, the International Committee of the Red Cross issued a “new technologies” edition of its quarterly journal. The journal included articles by campaigners on fully autonomous weapons.
During the UN General Assembly First Committee on Disarmament and International Security in New York in October, 16 governments made statements on killer robots. Also in October, campaign member the International Committee for Robot Arms Control launched a letter from over 250 roboticists, scientists and other experts calling for a ban on autonomous weapons.
In November at the Convention on Conventional Weapons (CCW) in Geneva, 35 nations expressed their views on lethal autonomous weapons systems. States parties to the Convention on Conventional Weapons agreed to a mandate to begin work in 2014 on the emerging technology of “lethal autonomous weapons systems.”
Mines Action Canada (MAC) welcomed this historic decision to begin to address this issue. MAC encouraged all states to pursue an international ban on these weapons to ensure there will always be meaningful human control over targeting decisions and the use of violent force. We were also pleased that Canada made its first public statements on this topic during the CCW, joining the other 43 nations that have spoken out on fully autonomous weapons since May. “If we have learned anything from the Canadian-led efforts to ban landmines, it is that the world cannot afford to wait until there is a humanitarian crisis to act. We need a pre-emptive ban on fully autonomous weapons before they can cause a humanitarian disaster,” said Paul Hannon, Executive Director of Mines Action Canada, in a press release.
Our colleagues around the world have also seen exciting developments in their countries. The international campaign has put together a global recap.
Canada does not have a national policy on autonomous weapons. There are many reasons why Canada needs to have a policy on killer robots as soon as possible. This year, MAC looks forward to working with the Government of Canada to develop a national policy and to work towards an international treaty banning killer robots.
You can take action in 2014 by signing our Keep Killer Robots Fiction petition, by sharing the campaign website www.stopkillerrobots.ca and by donating to this new campaign.
All the discussions we’ve been having since the launch of the Campaign to Stop Killer Robots make me think about Alice in Wonderland and therefore I’ve been thinking a lot about rabbit holes. I feel like current technology has us poised at the edge of a rabbit hole and if we take that extra step and create fully autonomous weapons we are going to fall – down that rabbit hole into the unknown, down into a future where a machine could make the decision to kill you, down into a situation that science fiction books have been warning us about for decades.
The best way to prevent such a horrific fall is going to be to create laws and policies that will block off the entrance to the rabbit hole so to speak. At the moment, not many countries have policies to temporarily block the entrance and no one has laws to ban killer robots and close off the rabbit hole permanently. It is really only the US and the UK who have even put up warning signs and a little bit of chicken wire around the entrance to this rabbit hole of killer robots through recently released policies and statements.
Over the past few weeks our colleagues at Human Rights Watch (HRW) and Article 36 have released reports on the US and UK policies towards fully autonomous weapons (killer robots). HRW analyzed the 2012 US policy on autonomous weapons found in Department of Defense Directive Number 3000.09. You can find the full review online. Article 36 has a lot to say about the UK policy in their paper available online as well.
So naturally after reading these papers, I went in search of Canada’s policy. That search left me feeling a little like Alice lost in Wonderland just trying to keep my head or at least my sanity in the face of a policy that like the Cheshire Cat might not be all there.
After my futile search, it became even more important that we talk to the government to find out if Canada has a policy on fully autonomous weapons. Until those conversations happen, let’s see what we can learn from the US and UK policies and the analysis done by HRW and Article 36.
The US Policy
I like that the US Directive notes the risks to civilians including “unintended engagements” and failure. One key point that Human Rights Watch’s analysis highlights is that the Directive states that for up to 10 years the US Department of Defense can only develop and use fully autonomous weapons that have non-lethal force. The moratorium on lethal fully autonomous weapons is a good start but there are also some serious concerns about the inclusion of waivers that could override the moratorium. HRW believes that “[t]hese loopholes open the door to the development and use of fully autonomous weapons that could apply lethal force and thus have the potential to endanger civilians in armed conflict.”[1]
In summary Human Rights Watch believes that:
The Department of Defense Directive on autonomy in weapon systems has several positive elements that could have humanitarian benefits. It establishes that fully autonomous weapons are an important and pressing issue deserving of serious concern by the United States as well as other nations. It makes clear that fully autonomous weapons could pose grave dangers and are in need of restrictions or prohibitions. It is only valid for a limited time period of five to ten years, however, and contains a number of provisions that could weaken its intended effect considerably. The Directive’s restrictions regarding development and use can be waived under certain circumstances. In addition, the Directive highlights the challenges of designing adequate testing and technology, is subject to certain ambiguity, opens the door to proliferation, and applies only to the Department of Defense.[2]
In terms of what this all means for us in Canada, we can see there may be some aspects of the American policy that are worth adopting. The restrictions on the use of lethal force by fully autonomous weapons should be adopted by Canada to protect civilians from harm without the limited time period and waivers. I believe that Canadians would want to ensure that humans always make the final decision about who lives and who dies in combat.
The UK Policy
Now our friends at Article 36 have pointed out the UK situation is a little more convoluted – and they are not quite ready to call it a comprehensive policy, but since “the UK assortment of policy-type statements” sounds ridiculous, for the purposes of this post I’m shortening it to the UK almost-policy with the hope that one day it will morph into a full policy. Unlike the US policy, which is found in a neat little directive, the UK almost-policy is cobbled together from some statements and a note from the Ministry of Defence. You can have a closer look at the Article 36 analysis of the almost-policy.
To sum up, Article 36 outlines three main shortcomings of the UK almost-policy:
One of the most interesting points that Article 36 makes is the need for a definition of what human control over weapons systems means. If you are like me, you probably think that would mean humans get to make the decision to fire on a target, making the final call on who lives or who dies, but we need to know exactly what governments mean when they say that humans will always be in control. The Campaign to Stop Killer Robots wants to ensure that there is always meaningful human control over lethal weapons systems.
Defining what we mean by meaningful human control is going to be a very large discussion that we want to have with governments, with civil society, with the military, with roboticists and with everyone else. This discussion will raise some very interesting moral and ethical questions especially since a two-star American general recently said that he thought it was “the ultimate human indignity to have a machine decide to kill you.” The problem is once that technology exists it is going to be incredibly difficult to know where that is going to go and how on earth we are going to get back up that rabbit hole. For us as Canadians it is key to start having that conversation as soon as possible so we don’t end up stumbling down the rabbit hole of fully autonomous weapons by accident.
- Erin Hunt, Program Officer
Non-governmental organizations convene to launch Campaign to Stop Killer Robots
(London, April 23, 2013) – Urgent action is needed to pre-emptively ban lethal robot weapons that would be able to select and attack targets without any human intervention, said a new campaign launched in London today. The Campaign to Stop Killer Robots is a coordinated international coalition of non-governmental organizations concerned with the implications of fully autonomous weapons, also called “killer robots.”
The Campaign to Stop Killer Robots calls for a pre-emptive and comprehensive ban on the development, production, and use of fully autonomous weapons. The prohibition should be achieved through an international treaty, as well as through national laws and other measures.
“Allowing life or death decisions on the battlefield to be made by machines crosses a fundamental moral line and represents an unacceptable application of technology,” said Nobel Peace Laureate Jody Williams of the Nobel Women’s Initiative. “Human control of autonomous weapons is essential to protect humanity from a new method of warfare that should never be allowed to come into existence.”
Over the past decade, the expanded use of unmanned armed vehicles or drones has dramatically changed warfare, bringing new humanitarian and legal challenges. Now rapid advances in technology are permitting the United States and other nations with high-tech militaries, including China, Israel, Russia, and the United Kingdom, to move toward systems that would give full combat autonomy to machines.
“Killer robots are not self-willed ‘Terminator’-style robots, but computer-directed weapons systems that once launched can identify targets and attack them without further human involvement,” said roboticist Noel Sharkey, chair of the International Committee for Robot Arms Control. “Using such weapons against an adaptive enemy in unanticipated circumstances and in an unstructured environment would be a grave military error. Computer controlled devices can be hacked, jammed, spoofed, or can be simply fooled and misdirected by humans.”
The Campaign to Stop Killer Robots seeks to provide a coordinated civil society response to the multiple challenges that fully autonomous weapons pose to humanity. It is concerned about weapons that operate on their own without human supervision. The campaign seeks to prohibit taking a human out-of-the-loop with respect to targeting and attack decisions on the battlefield.
“The capability of fully autonomous weapons to choose and fire on targets on their own poses a fundamental challenge to the protection of civilians and to compliance with international law,” said Steve Goose, Arms Division director at Human Rights Watch. “Nations concerned with keeping a human in the decision-making loop should acknowledge that international rules on fully autonomous weapons systems are urgently needed and work to achieve them.”
The UN Special Rapporteur on extrajudicial, summary or arbitrary executions for the Office of the High Commissioner for Human Rights, Professor Christof Heyns, is due to deliver his report on lethal autonomous robotics to the second session of the Human Rights Council in Geneva, starting May 27, 2013. The report is expected to contain recommendations for government action on fully autonomous weapons.
“One key lesson learned from the Canadian led initiative to ban landmines was that we should not wait until there is a global crisis before taking action.” said Paul Hannon, Executive Director of Mines Action Canada. “The time to act on killer robots is now”
“We cannot afford to sleepwalk into an acceptance of these weapons. New military technologies tend to be put in action before the wider society can assess the implications, but public debate on such a change to warfare is crucial,” said Thomas Nash, Director of Article 36. “A pre-emptive ban on lethal autonomous robots is both necessary and achievable, but only if action is taken now.”
The Campaign to Stop Killer Robots believes that humans should not delegate the responsibility of making lethal decisions to machines. It has multiple moral, legal, technical, and policy concerns with the prospect of fully autonomous weapons, including:
The Campaign to Stop Killer Robots includes several non-governmental organizations (NGOs) associated with the successful efforts to ban landmines, cluster munitions, and blinding lasers. Its members collectively have a wide range of expertise in robotics and science, aid and development, human rights, humanitarian disarmament, international law and diplomacy, and the empowerment of women, children, and persons with disabilities. The campaign is building a worldwide network of civil society contacts in countries including Canada, Egypt, Japan, The Netherlands, New Zealand, Pakistan, United Kingdom, and the United States.
The Steering Committee is the principal leadership and decision-making body of the Campaign to Stop Killer Robots and comprises nine NGOs: five international NGOs (Human Rights Watch, International Committee for Robot Arms Control, Nobel Women’s Initiative, Pugwash Conferences on Science & World Affairs, and Women’s International League for Peace and Freedom) and four national NGOs (Article 36 (UK), Association for Aid and Relief Japan, Mines Action Canada, and IKV Pax Christi (The Netherlands)).
The Campaign to Stop Killer Robots was established by representatives of seven of these NGOs at a meeting in New York on 19 October 2012. It is an inclusive and diverse coalition open to NGOs, community groups, and professional associations that support the campaign’s call for a ban and are willing to undertake actions and activities in support of the campaign’s objectives. The campaign’s initial coordinator is Mary Wareham of Human Rights Watch.
On Monday, April 22, the Steering Committee of the Campaign to Stop Killer Robots convened a day-long conference for 60 representatives from 33 NGOs from ten countries to discuss the potential harm that fully autonomous weapons could pose to civilians and to strategize on actions that could be taken at the national, regional, and international levels to ban the weapons.
Contact information for the Campaign to Stop Killer Robots:
To schedule a media interview (see list of spokespersons), please contact:
Video Footage
For more information, see:
List of Spokespersons
The following campaign spokespersons will be speaking at the launch events in London on 22-24 April and are available for interview on request. In addition, raw interview footage of Williams, Sharkey, Goose, and Docherty is available here: http://multimedia.hrw.org/distribute/hpgicavqly
Principal Spokespersons
Ms. Jody Williams – Nobel Women’s Initiative, @JodyWilliams97 @NobelWomen
Jody Williams received the Nobel Peace Prize in 1997 for her work to ban landmines through the International Campaign to Ban Landmines, which shared the Peace Prize. In January 2006, Jody established the Nobel Women’s Initiative together with five of her sister Nobel Peace laureates. In an April 2011 article for the International Journal of Intelligence Ethics, Nobel Peace Laureate Jody Williams calls for a ban on “fully autonomous attack and kill robotic weapons.” In March 2013, the University of California Press published a memoir on her work entitled My Name is Jody Williams: A Vermont Girl’s Winding Path to the Nobel Peace Prize. Williams can speak on why civil society is coming together and partnering with other actors to pursue a pre-emptive ban on fully autonomous weapons. Longer biography available here: /JKVvBd
Prof. Noel Sharkey – International Committee for Robot Arms Control, @StopTheRobotWar
Roboticist Noel Sharkey is Professor of Artificial Intelligence and Robotics and Professor of Public Engagement at the University of Sheffield. He is co-founder and chair of the International Committee for Robot Arms Control (ICRAC), a group of experts concerned with the pressing dangers that military robots pose to peace and international security. Sharkey can speak on the technology that the campaign is seeking to prohibit and its ethical implications. See also: /9fJQ7j
Mr. Steve Goose – Human Rights Watch, @hrw
Steve Goose is executive director of the Arms Division of Human Rights Watch and chair of the International Campaign to Ban Landmines and Cluster Munition Coalition (ICBL-CMC). Goose and Human Rights Watch were instrumental in bringing about the 2008 Convention on Cluster Munitions, the 1997 international treaty banning antipersonnel mines, the 1995 protocol banning blinding lasers, and the 2003 protocol on explosive remnants of war. Goose can speak on why a ban on fully autonomous weapons is necessary and achievable, and explain current US policy and practice. See also: /USEBZo
Mr. Thomas Nash – Article 36, @nashthomas @article36
Thomas Nash is director of Article 36 and joint coordinator of the International Network on Explosive Weapons. As Coordinator of the Cluster Munition Coalition from 2004 to 2011, Nash led the global civil society efforts to secure the Convention on Cluster Munitions. Nash can speak about civil society expectations of UK policy, practice, and diplomacy on fully autonomous weapons.
Ms. Mary Wareham – Human Rights Watch, @marywareham, @hrw
Mary Wareham is advocacy director of the Arms Division of Human Rights Watch and initial coordinator of the Campaign to Stop Killer Robots. She worked on the processes that created the Convention on Cluster Munitions and the Mine Ban Treaty, and has worked to ensure their universalization and implementation. Wareham can speak about the new Campaign to Stop Killer Robots and its initial plans.
Technical Experts
Dr. Jürgen Altmann – International Committee for Robot Arms Control
Jürgen Altmann is co-founder and vice-chair of the International Committee for Robot Arms Control. He is a physicist and peace researcher at Dortmund Technical University in Germany. Altmann has studied preventive arms control of new military technologies and new methods for the verification of disarmament agreements. He can speak about Germany’s policy and practice on fully autonomous weapons.
Dr. Peter Asaro – International Committee for Robot Arms Control, @peterasaro
Peter Asaro is co-founder and vice-chair of the International Committee for Robot Arms Control. He is a philosopher of technology who has worked in Artificial Intelligence, neural networks, natural language processing and robot vision research. Asaro is director of Graduate Programs for the School of Media Studies at The New School for Public Engagement in New York City. See also: /73JqBw
Ms. Bonnie Docherty – Human Rights Watch, @hrw
Bonnie Docherty is senior researcher in the Arms Division at Human Rights Watch and also a lecturer on law and senior clinical instructor at the International Human Rights Clinic at Harvard Law School. She has played an active role, as both lawyer and field researcher, in the campaign against cluster munitions. Docherty’s report Losing Humanity: The Case against Killer Robots outlines how fully autonomous weapons could violate the laws of war and undermine fundamental protections for civilians. See also: /103PV4t
Mr. Richard Moyes – Article 36, @rjmoyes @article36
Richard Moyes is a managing partner at Article 36 and an honorary fellow at the University of Exeter. He was previously director of policy at Action on Armed Violence (formerly Landmine Action) and served as co-chair of the Cluster Munition Coalition. Moyes can speak about civil society expectations of UK policy, practice, and diplomacy on fully autonomous weapons. See also: /103SAuS
Steering Committee members
Human Rights Watch, www.hrw.org
Human Rights Watch is serving as initial coordinator of the Campaign to Stop Killer Robots. Over the past two decades, the Arms Division of Human Rights Watch has been instrumental in enhancing protections for civilians affected by conflict, leading the International Campaign to Ban Landmines that resulted in the 1997 Mine Ban Treaty and the Cluster Munition Coalition, which spurred the 2008 Convention on Cluster Munitions. It also led the effort that resulted in the pre-emptive prohibition on blinding laser weapons in 1995. In November 2012, Human Rights Watch and Harvard Law School’s International Human Rights Clinic launched the report Losing Humanity: The Case against Killer Robots, the first in-depth report by a non-governmental organization on the challenges posed by fully autonomous weapons.
Article 36 (UK), www.article36.org
Article 36 is a UK-based not-for-profit organization working to prevent the unintended, unnecessary or unacceptable harm caused by certain weapons. It undertakes research, policy and advocacy and promotes civil society partnerships to respond to harm caused by existing weapons and to build a stronger framework to prevent harm as weapons are used or developed in the future. In March 2012, Article 36 called for a ban on military systems that are able to select and attack targets autonomously.
Association for Aid and Relief Japan, www.aarjapan.gr.jp
Association for Aid and Relief, Japan is an international non-governmental organization founded in Japan in 1979. As a committed member of the International Campaign to Ban Landmines, Association for Aid and Relief, Japan played a central role in convincing Japan to ban antipersonnel landmines and join the 1997 Mine Ban Treaty.
IKV Pax Christi (The Netherlands), www.ikvpaxchristi.nl
IKV Pax Christi is a peace organization based in the Netherlands. It works with local partners in conflict areas and seeks political solutions to crises and armed conflicts. In May 2011, Dutch NGO IKV Pax Christi published a report entitled Does Unmanned Make Unacceptable? Exploring the Debate on using Drones and Robots in Warfare.
International Committee for Robot Arms Control, http://icrac.net
The International Committee for Robot Arms Control (ICRAC) is a not-for-profit organization comprised of scientists, ethicists, lawyers, roboticists, and other experts. It works to address the potential dangers involved with the development of armed military robots and autonomous weapons. Given the rapid pace of development of military robots and the pressing dangers their use poses to peace, international security, the rule of law, and to civilians, ICRAC supports a ban on armed robots with autonomous targeting capability.
Mines Action Canada, www.minesactioncanada.org
Mines Action Canada is a coalition of over 35 Canadian non-governmental organizations working in mine action, peace, development, labour, health and human rights that came together in 1994. It is the Canadian partner of the International Campaign to Ban Landmines and a founding member of the Cluster Munition Coalition.
Nobel Women’s Initiative, nobelwomensinitiative.org
The Nobel Women’s Initiative was established in January 2006 by 1997 Nobel Peace Laureate Jody Williams and five of her sister Nobel Peace laureates. The Nobel Women’s Initiative uses the prestige of the Nobel Peace Prize and of courageous women peace laureates to magnify the power and visibility of women working in countries around the world for peace, justice and equality. In an April 2011 article for the International Journal of Intelligence Ethics, Nobel Peace Laureate Jody Williams calls for a ban on “fully autonomous attack and kill robotic weapons.”
Pugwash Conferences on Science & World Affairs, www.pugwash.org
A central objective of Pugwash is the elimination of all weapons of mass destruction (nuclear, chemical and biological) and of war as a social institution for settling international disputes. To that end, the peaceful resolution of conflicts through dialogue and mutual understanding is an essential part of Pugwash activities, one that is particularly relevant when and where nuclear weapons and other weapons of mass destruction are deployed or could be used.
Women’s International League for Peace and Freedom, www.wilpf.org
The Women’s International League for Peace and Freedom (WILPF) is the oldest women’s peace organization in the world. Its aims and principles include working toward world peace; total and universal disarmament; the abolition of violence and coercion in the settlement of conflict and their substitution in every case by negotiation and conciliation; the strengthening of the United Nations system; the continuous development and implementation of international law; political and social equality and economic equity; co-operation among all people; and an environmentally sustainable development.
# # #