Today, Mines Action Canada's Program Coordinator made an intervention during CCW discussions about autonomous weapons systems and weapons review processes.
Thank you Madame Chair. I would like to take this opportunity to share Mines Action Canada’s observations about Article 36 reviews.
Like many others, Mines Action Canada was concerned to learn that there was so little transparency around Article 36 weapons reviews at last year’s experts meeting. The fact that so few states were willing to discuss their weapons review process is a significant impediment to the prevention of humanitarian harm caused by new weapons. Indeed it seems that too few states actually undertake these reviews in a comprehensive manner.
Last year’s revelations concerning Article 36 reviews have made it clear that international discussions on the topic are necessary. Today is a start. States need to be more transparent in their weapons review processes. Sharing criteria and standards or setting international standards will do much to shed light on the shadowy world of arms procurement. Mines Action Canada believes that Article 36 weapons reviews should be a topic of discussion at the international level to strengthen both policy and practice around the world.
However, better weapons reviews will not solve the problems associated with autonomous weapons systems for a number of reasons.
First, there is the issue of timing. A successful international process to increase the effectiveness of weapons reviews will require a significant amount of time – time we do not have in the effort to prevent the use of autonomous weapons systems because technology is developing too rapidly.
Second, weapons reviews were designed for a very different type of weapon than autonomous weapons systems, which have been called the third revolution in warfare. Autonomous weapons systems will blur the line between weapon and soldier to a level that may be beyond the ability of a weapons review process. In addition, the systemic complexity that will be required to operate such a weapons system is a far cry from the more linear processes found in current weapons.
Third, Article 36 reviews are not obligated to cover weapons used for domestic purposes outside of armed conflict such as policing, border control, or crowd control. Mines Action Canada, along with many civil society organizations and states present here, has serious concerns about the possible use of autonomous weapons systems in law enforcement and uses outside of armed conflict more generally.
Fourth and most importantly, weapons reviews cannot answer the moral questions surrounding delegating the kill decision to a machine. An Article 36 review cannot tell us if it is acceptable for an algorithm to kill without meaningful human control. And that is one of the key questions we are grappling with here this week.
Article 36 weapons reviews are a legal obligation for most of the states here. It is time for a separate effort to strengthen the standards and transparency around weapons reviews. That effort must neither distract from nor overtake our work here to deal with the real moral, legal, ethical and security problems associated with autonomous weapons systems. Weapons reviews must be supplemented by new and robust international law that clearly and deliberately puts meaningful human control at the centre of all new weapons development.
The concerns raised by autonomous weapons are urgent and must take priority. In fact, a GGE next year on autonomous weapons will greatly assist future work on weapons reviews by highlighting the many challenges new technologies pose for such reviews.
Overall, there is a need for international work to improve Article 36 reviews, but there is little evidence to back up the claims of some states that weapons review processes would be sufficient to ensure that autonomous weapons systems are acceptable. Article 36 reviews are only useful once questions of the moral and ethical acceptability of a weapon have been dealt with. Until that time, it would be premature to view weapons reviews as a panacea for our issues here at CCW.
This week states are meeting at the United Nations in Geneva to decide if discussion on lethal autonomous weapons systems will continue at the Convention on Conventional Weapons. States should continue to discuss this issue and to debate key problems with autonomous weapons systems. One of the key problems is the issue of human control. Learn more with this new video.
Proportionality is a key term for international humanitarian law that means any collateral damage must be proportional to the military gain from any action. We are very concerned that fully autonomous weapons systems or killer robots won’t be able to weigh proportionality.
Last week, 87 states gathered in Geneva to discuss lethal autonomous weapons systems.
This Informal Experts Meeting ran from May 13 to May 16 and was the first international discussion on autonomous weapons systems. The meeting was focused on information rather than decision making. The 87 states attended the meeting under the Convention on Conventional Weapons (CCW) along with representatives from UN agencies including UNIDIR, the International Committee of the Red Cross (ICRC), and registered non-governmental organizations including the delegation of the Campaign to Stop Killer Robots.
The four-day meeting included general debate followed by substantive sessions with presentations from experts. The Chair's summary showed that there is a willingness to pursue this topic, and a possible issue for the next meetings would be the concept of meaningful human control. The options cited for going forward include an exchange of information, the development of best practices, a moratorium on research, and a ban. The Campaign to Stop Killer Robots has a great piece about the meeting on their website.
Over the course of the week many states highlighted the importance of always maintaining meaningful human control over targeting and attack decisions. We at MAC were pleased not only that five countries have already called for a ban, but also that no country vigorously defended or argued for autonomous weapons systems, although the Czech Republic and Israel each spoke on the desirability of such systems.
Unlike most countries, Canada has not yet provided copies of its statements to Reaching Critical Will or to the United Nations, so we have had to piece together the statements from the CCW Review and Twitter. On day 1, Canada was the only country to say that existing international humanitarian law is sufficient to regulate the use of autonomous weapons. It also said that the definition of autonomy is difficult, as autonomy is subjective depending on the system. On day 2, Canada said that the moral aspects of autonomous weapons are important and must be part of discussions in the CCW. It looks like Canada did not make any statements or interventions on day 3. On day 4, Canada called for more discussion on the ethical and political issues, including meaningful human control, under the CCW. Canada also said humanitarian and state security concerns must be balanced in considering autonomous weapons – language usually heard from Russia, China and similar states.
Some of the presentations from the substantive sessions are available online:
Technological Issues – key topics included definitions of autonomy and meaningful human control. This session included a debate between Ron Arkin, who believes that it is premature to ban autonomous weapons, and Noel Sharkey, who does not believe that computerised weapons without a human in control can fully comply with international humanitarian law in the foreseeable future.
Ethics and Sociology – key topics included whether machines should make the decision to take a human life, the relevance of human judgement to international law and the need for human control.
Legal Issues (International Humanitarian Law) – key topics included definitions, whether or not autonomous weapons systems are inherently illegal, morality and military effectiveness. This was an extensive debate.
Legal Issues (other areas of international law) – key topics included human rights law, accountability and article 36 weapons reviews.
Operational and military issues – key topics included meaningful human control, military effectiveness and the nature of warfare.
The Campaign to Stop Killer Robots held side events each day to delve deeper into the issues at hand. These side events were well attended, and lively discussions covered the topics in greater depth.
While the meetings were progressing in Geneva, Mines Action Canada was working at the national level to ensure these historic sessions received media coverage across Canada. For example:
- Paul Hannon was on Calgary's News Talk 770 and News Talk 610 in St. Catharines.
- Erin Hunt was on Kevin Newman Live (starts at the 2:40 mark) and CFAX 1070 in Victoria (starts at the 6:07 mark).
- Dr. Ian Kerr was on Ontario Today – you should definitely check out the call of the day.
- Prof. Noel Sharkey was on CBC’s As It Happens (starts at 9:40 mark)
- The Globe and Mail, the Weather Network, Global News, CTV News, Ottawa Citizen and Metro also covered the issue while the Ottawa Citizen Defense Blog picked up our press release.
CCW member states will reconvene in November to decide if they want to continue these talks. Until then Mines Action Canada and our colleagues in the international campaign will continue to push for a renewed and expanded mandate including continued discussions on meaningful human control over all targeting and firing decisions.
by Brett MacFarlane
When I first applied for an internship position to work on the Campaign to Stop Killer Robots back in November, I knew virtually nothing about either the campaign or the killer robots issue. I chose the internship with Mines Action Canada as my top choice because it was the position most closely related to my field of study: Conflict Analysis and Conflict Resolution. When submitting my application, I had a conversation with my fellow students about what exactly killer robots were. The general consensus of the group was that killer robots had to be the drones being used militarily in countries such as Pakistan and Yemen.
Since joining the International Campaign to Stop Killer Robots in January, I have had the privilege of being exposed to a new issue that has not been discussed by the general public or even most international affairs students. I learned about current development efforts by militaries to create robotic weapons which would have complete autonomy to choose whether or not to fire on a specified target without meaningful human control. Most disturbingly, I learned that some countries (e.g. the United States, Israel, and several others) have not only taken steps to develop "human-out-of-the-loop weapons" but that some current technologies could easily be adapted to become autonomous weapons. As a student in an international affairs program and as a concerned person, I find that this issue raises serious human rights and humanitarian concerns.
The use of autonomous weapons is a troubling issue for human rights advocates and humanitarian organizations because it would make humans increasingly vulnerable in warfare, where international law is not designed to accommodate autonomous weapons. First, how could the protection of civilians be guaranteed in times of combat? If human judgment is taken out of the battlefield, robots would be tasked with distinguishing armed combatants from ordinary citizens. In this scenario, would a robot have the capability to differentiate between a soldier holding a weapon and a child holding a toy gun? Such mistakes become more likely as robots are given greater autonomy and decision-making capabilities on the battlefield. Further, the development and use of autonomous weapons could pose serious issues of accountability in war. For example, if a robotic system were to go awry and end up massacring a village of non-combatants, who would be held accountable? Would it be the systems operator of the machine, the military, the computer programmer, or the manufacturer of the machine? Without military troops in the air, on land, or at sea, who can be held liable for the actions of robots in combat? Implementing the use of autonomous robots in war would severely reduce the legal protections civilians are accorded during conflict.
I am very concerned that putting autonomous weapons on the battlefield would change how wars are fought and conducted. Wars would no longer be fought by the military personnel of two opposing sides, but by autonomous weapons, capable of making their own 'kill decision', against human forces. Countries with the financial means to develop autonomous weapons could threaten less developed countries, which would bear the costs of higher human casualties on the battlefield. More importantly, the potential for future conflict will grow, as the decision to enter into combat would be much easier for leaders to make if they did not have to bear the costs of human casualties. The concern here is that countries would be sending machines to fight against humans, instead of the traditional model of human versus human. As difficult as this may be to hear, it is only through the casualties of soldiers on the battlefield that we are able to see the true cost of warfare. Taking human sacrifice out of the battlefield could potentially cause an increase in future warfare.
As interest in the topic of killer robots grows in the international community, it is important that students, and indeed all citizens, begin to discuss the development of autonomous robots for military use in their respective fields. Should silence continue not only in the academic community, but in the Canadian parliament and the public domain, the potential for autonomous robots to make life and death decisions on the battlefield without human control may be realized. As one concerned student and citizen who has signed the petition to Keep Killer Robots Fiction, I strongly encourage everyone to help keep killer robots fiction by not only increasing their knowledge of the subject, but by joining me in signing the petition at http://bit.ly/KRpetition. Only through increased discussion and knowledge of this topic in the general community can pressure be mounted on governments to create a pre-emptive ban on this emerging threat.
Brett MacFarlane interned at Mines Action Canada and is a Master of Arts candidate at the Norman Paterson School of International Affairs at Carleton University, specializing in Conflict Analysis and Conflict Resolution.
Come join us to launch the Campaign to Stop Killer Robots in Canada.
Join us at Ottawa City Hall for a panel discussion on fully autonomous weapons, led by Mines Action Canada and including guest speakers:
- Ian Kerr – Professor, University of Ottawa, and Canada Research Chair for Ethics, Law and Technology (Ottawa, ON)
- Mary Wareham – Advocacy Director – Arms Division, Human Rights Watch, and Global Coordinator for the Campaign to Stop Killer Robots (Washington, DC)
- Peter Asaro – The New School for Public Engagement and Co-Founder and Co-Chair of the International Committee for Robot Arms Control (New York, NY)
- Paul Hannon – Executive Director, Mines Action Canada (Ottawa, ON)
Where? : Ottawa City Hall – the Colonel By room
When? : April 28th, 7:00 pm (Doors at 6:45 pm)
Check out the Public Event Flyer for all the details.
In the past we’ve posted about scientists, human rights advocates, disarmament organizations and politicians who have spoken out against killer robots and the support for a ban on autonomous weapons continues to grow. Faith groups, religious leaders and faith-based organizations are beginning to call for a ban on killer robots.
In November 2013, the World Council of Churches made a statement recommending that governments: "Declare their support for a pre-emptive ban on drones and other robotic weapons systems that will select and strike targets without human intervention when operating in fully autonomous mode;".
Building on that recommendation, our colleagues in the Netherlands have launched an Interfaith Declaration that says:
we, as religious leaders, faith groups and faith-based organizations, raise our collective voice to
call on all governments to participate in the international debate on the issue, and to work
towards a ban on the development, production and use of fully autonomous weapons.
We’re calling on all Canadian religious leaders, faith-based organizations and faith groups to support a ban on autonomous weapons and to sign the Interfaith Declaration. Here is the full text of the Declaration: Interfaith Declaration.pdf (EN) and Interfaith Declaration FR.pdf (FR). To sign the declaration digitally, visit http://www.paxforpeace.nl/stay-informed/news/interfaith-declaration or contact PAX directly at firstname.lastname@example.org. In addition to the Interfaith Declaration for religious leaders and faith groups, individuals can sign Mines Action Canada’s Keep Killer Robots Fiction petition.
The Campaign to Stop Killer Robots has been trundling along all summer sharing our message, reaching out to governments and gaining new supporters.
There have been some exciting and important developments over the summer. The International Committee of the Red Cross (ICRC) launched the newest edition of the International Review of the Red Cross and the theme is New Technologies and Warfare. A number of campaigners contributed to the journal so it is definitely worth a read. The ICRC also published a Frequently Asked Questions document on autonomous weapons that helps explain the issue and the ICRC’s position on fully autonomous weapons.
France along with the United Nations Office for Disarmament Affairs in Geneva convened a seminar on fully autonomous weapons for governments and civil society in early September. The Campaign to Stop Killer Robots had campaigners taking part and you can read the full report on the global campaign’s website.
The campaigns in Germany and Norway are starting off strong as well. In the lead up to the German election, all the major parties shared their policy positions in regards to fully autonomous weapons with our colleagues at Facing Finance. Norwegian campaigners launched their campaign with a breakfast seminar and now they are waiting to hear what the new Norwegian government’s policy on fully autonomous weapons will be.
Like our colleagues in Norway, we’re still waiting to hear what Canada’s policy on fully autonomous weapons will be. We have written to the Ministers of National Defence and of Foreign Affairs, but the campaign team has not yet heard back. In the meantime, Canadians can weigh in on the topic through our new online petition. Share and sign the petition today! This petition is the first part of a new initiative that will be coming your way in a few weeks. Keep your eye out for the news, and until then keep sharing the petition so that the government knows that Canadians have concerns about fully autonomous weapons and believe that Canada should have a strong policy against them.
EDIT: We had a very human moment here and forgot to include congratulations to James Foy of Vancouver for winning the 2013 Canadian Bar Association’s National Military Law Section Law School Sword and Scale Essay Prize for his essay called Autonomous Weapons Systems: Taking the Human out of International Humanitarian Law. It is great to see law students looking at this new topic and also wonderful that the Canadian Bar Association recognized the importance of this issue. Congratulations James!
Last month at the United Nations Human Rights Council, we were slightly concerned when the UK was the only state opposed to a moratorium or a ban on fully autonomous weapons. After a parliamentary debate on June 17, 2013, we have a little more clarity. In response to a speech by Nia Griffith, MP, the Minister for Counter Proliferation, Alistair Burt MP, agreed that fully autonomous weapons will not “be able to meet the requirements of international humanitarian law” and stressed that the UK does not have fully autonomous weapons and does not plan to acquire any.
Our colleagues at Article 36 have done a detailed analysis of the debate. In light of the stronger language in this debate, there is some room to be optimistic:
It would seem straightforward to move from such a strong national position to a formalised national moratorium and a leading role within an international process to prohibit such weapons. The government did not provide any reason as to why a moratorium would be inappropriate, other than to speculate on the level of support amongst other countries for such a course of action.
Whilst significant issues still require more detailed elaboration, Article 36 believes this parliamentary debate has been very valuable in prompting reflection and Ministerial scrutiny of UK policy on fully autonomous weapons and narrowing down the areas on which further discussions should focus. It appears clear now that there will be scope for such discussions to take place with the UK and other states in the near future.
The UK parliamentary debate and Article 36’s analysis of it, coming so soon after the Human Rights Council debate and the widespread media coverage of the issue, make it quite clear that it is time for a similarly substantive and non-partisan debate in the Canadian House of Commons as the government works out its policy on this important issue.
- Avoiding Rabbit Holes Through Policy and Law (stopkillerrobots.ca)
All the discussions we’ve been having since the launch of the Campaign to Stop Killer Robots make me think about Alice in Wonderland and therefore I’ve been thinking a lot about rabbit holes. I feel like current technology has us poised at the edge of a rabbit hole and if we take that extra step and create fully autonomous weapons we are going to fall – down that rabbit hole into the unknown, down into a future where a machine could make the decision to kill you, down into a situation that science fiction books have been warning us about for decades.
The best way to prevent such a horrific fall is going to be to create laws and policies that will block off the entrance to the rabbit hole so to speak. At the moment, not many countries have policies to temporarily block the entrance and no one has laws to ban killer robots and close off the rabbit hole permanently. It is really only the US and the UK who have even put up warning signs and a little bit of chicken wire around the entrance to this rabbit hole of killer robots through recently released policies and statements.
Over the past few weeks our colleagues at Human Rights Watch (HRW) and Article 36 have released reports on the US and UK policies towards fully autonomous weapons (killer robots). HRW analyzed the 2012 US policy on autonomous weapons found in Department of Defense Directive Number 3000.09. You can find the full review online. Article 36 has a lot to say about the UK policy in their paper available online as well.
So naturally after reading these papers, I went in search of Canada’s policy. That search left me feeling a little like Alice lost in Wonderland just trying to keep my head or at least my sanity in the face of a policy that like the Cheshire Cat might not be all there.
After my futile search, it became even more important that we talk to the government to find out if Canada has a policy on fully autonomous weapons. Until those conversations happen, let’s see what we can learn from the US and UK policies and the analysis done by HRW and Article 36.
The US Policy
I like that the US Directive notes the risks to civilians including “unintended engagements” and failure. One key point that Human Rights Watch’s analysis highlights is that the Directive states that for up to 10 years the US Department of Defense can only develop and use fully autonomous weapons that have non-lethal force. The moratorium on lethal fully autonomous weapons is a good start but there are also some serious concerns about the inclusion of waivers that could override the moratorium. HRW believes that “[t]hese loopholes open the door to the development and use of fully autonomous weapons that could apply lethal force and thus have the potential to endanger civilians in armed conflict.”
In summary, Human Rights Watch believes that:
The Department of Defense Directive on autonomy in weapon systems has several positive elements that could have humanitarian benefits. It establishes that fully autonomous weapons are an important and pressing issue deserving of serious concern by the United States as well as other nations. It makes clear that fully autonomous weapons could pose grave dangers and are in need of restrictions or prohibitions. It is only valid for a limited time period of five to ten years, however, and contains a number of provisions that could weaken its intended effect considerably. The Directive’s restrictions regarding development and use can be waived under certain circumstances. In addition, the Directive highlights the challenges of designing adequate testing and technology, is subject to certain ambiguity, opens the door to proliferation, and applies only to the Department of Defense.
In terms of what this all means for us in Canada, we can see there may be some aspects of the American policy that are worth adopting. The restrictions on the use of lethal force by fully autonomous weapons should be adopted by Canada to protect civilians from harm without the limited time period and waivers. I believe that Canadians would want to ensure that humans always make the final decision about who lives and who dies in combat.
The UK Policy
Now our friends at Article 36 have pointed out that the UK situation is a little more convoluted – they are not quite ready to call it a comprehensive policy, but since “the UK assortment of policy-type statements” sounds ridiculous, for the purposes of this post I’m shortening it to the UK almost-policy, with the hope that one day it will morph into a full policy. Unlike the US policy, which is found in a neat little directive, the UK almost-policy is cobbled together from some statements and a note from the Ministry of Defence. You can have a closer look at the Article 36 analysis of the almost-policy.
To sum up, Article 36 outlines three main shortcomings of the UK almost-policy:
- The policy does not set out what is meant by human control over weapon systems.
- The policy does not prevent the future development of fully autonomous weapons.
- The policy says that existing international law is sufficient to “regulate the use” of autonomous weapons.
One of the most interesting points that Article 36 makes is the need for a definition of what human control over weapons systems means. If you are like me, you probably think that means humans get to make the decision to fire on a target, making the final call on who lives and who dies, but we need to know exactly what governments mean when they say that humans will always be in control. The Campaign to Stop Killer Robots wants to ensure that there is always meaningful human control over lethal weapons systems.
Defining what we mean by meaningful human control is going to be a very large discussion that we want to have with governments, with civil society, with the military, with roboticists and with everyone else. This discussion will raise some very interesting moral and ethical questions, especially since a two-star American general recently said that he thought it was “the ultimate human indignity to have a machine decide to kill you.” The problem is that once that technology exists, it is going to be incredibly difficult to know where it will go and how on earth we will get back up that rabbit hole. For us as Canadians, it is key to start having that conversation as soon as possible so we don’t end up stumbling down the rabbit hole of fully autonomous weapons by accident.
– Erin Hunt, Program Officer