Monthly Archives: April 2015
After the last informal meeting of experts in Geneva on killer robots (or, as they are formally known, “lethal autonomous weapon systems”) wrapped up, it is an appropriate time to take stock of what we learned from the conference. A lot of ground was covered in Geneva, too much for one short blog post, but a few ideas received considerable attention and are worth mentioning here.
First and foremost, the idea of ‘meaningful human control’ got a lot of attention from all sides in the debate. So what is meaningful human control, and how does it impact the debate on killer robots? Simply put, meaningful human control means that a human will always be the one who makes the decision whether or not to use force. These systems are often described in three ways: human ‘in the loop’, human ‘on the loop’ and human ‘out of the loop’. A system with humans ‘out of the loop’ can target and use force without any human control; this is the type of system that the Campaign to Stop Killer Robots seeks to ban. Systems with humans ‘on the loop’ give humans the ability to monitor the activity of the weapon and stop it if necessary; however, these systems may not give the decision maker enough time to assess the information reported by the weapon. Finally, systems with humans ‘in the loop’ are more akin to traditional weapon systems, where the decision to use force rests firmly with a human operator.
The discussion of meaningful human control was linked to discussions about whether it is ethical or moral to delegate life-and-death decisions to machines. Some criticize this approach on the grounds that meaningful human control is not a legal standard, or is too vague, but that criticism misses the point. This moral and ethical consideration is at the heart of the debate on killer robots: if only strict legal standards were applied, then the capability and function of the technology would begin to determine how it is used. A strictly legal analysis could permit the use of killer robots in areas that seemingly have no impact on civilians, such as outer space. Once such a precedent was set, it would be difficult to stop the full use of killer robots.
After meaningful human control, the arguments against a pre-emptive ban on killer robots formed a consistent theme throughout the conference, no matter the specific subject at hand. The refrain goes something like this: “We don’t know how this technology will evolve, so a pre-emptive ban could deprive the world of potentially useful technologies.” Yet there is a concrete counter-example (the pre-emptive ban on blinding laser weapons), and various other treaties with dual-use implications have proven that banning a class of weapon does not adversely impact commercial or industrial activity. The Chemical Weapons Convention, which was discussed at the meeting, provides a good example of how an export-control regime and competent verification can stop the spread of chemical weapons while maintaining the ability of states to develop chemical industries.
Clearly then, neither of these concerns should stop us from pursuing a pre-emptive ban on killer robots. As a co-founder of the Campaign to Stop Killer Robots, Mines Action Canada encourages all of you to engage with the issue and to advocate for a ban with your friends, family, local politicians and anyone else who will listen. An easy way to start is by signing and sharing our petition to Keep Killer Robots Fiction here: http://killerrobots-minesactioncanada.nationbuilder.com/.
Michael Binnington is an M.A. candidate at the Norman Paterson School of International Affairs and a Research Associate at Mines Action Canada.
Executive Director Paul Hannon delivered our closing statement at the Convention on Conventional Weapons today. Download the statement here or read it below.
The Way Forward
Thank you, Mr. Chair, and your team for laying a strong foundation to move forward with the urgency and focus this issue requires. This week we have seen wide-ranging discussions on autonomous weapons systems. The CCW does not deal often enough with issues of morality, human rights and ethics. We welcome all states that have asserted the necessity of maintaining meaningful human control over the use of force. These conversations should continue and deepen.
There is one issue we would like to raise as food for thought. At times during the week, we have felt that some have underestimated the skills, knowledge, intelligence, training, experience, humanity and morality that men and women in uniform combine with situational awareness and IHL to make decisions during conflict. We work closely with roboticists, engineers and technical experts, and despite their expertise and the high quality of their work, we do not believe an algorithm could replicate this complex decision-making process. Robotics should only be used to inform and supplement human decision making. To go further than that risks “dehumanizing those we expose to harm”, as the editorial in RCW’s CCW Report stated yesterday.
Allow me to conclude with the assertion that the international response to the possibility of autonomous weapons systems must not be limited to transparency alone. The expert presentations and the debates this week have strengthened our belief that autonomous weapons systems are not a typical new weapon and our current IHL and weapons review processes will not be sufficient. A mandate for a group of governmental experts next year is an appropriate and obvious next step. We look forward to working with the high contracting parties to ensure that meaningful human control remains at the centre of all decisions to use violent force.
Today at the Convention on Conventional Weapons meeting about lethal autonomous weapons systems, Mines Action Canada released a new memo to delegates on the impact of autonomous weapons systems on public trust in robotics. In this memo we discuss how the creation and use of autonomous weapons systems could change public perception of robotics more generally. Read the memo here and let us know what you think!
Will the use of killer robots make you more or less likely to want other autonomous robots in your life?
With the Convention on Conventional Weapons high contracting parties meeting this week to discuss lethal autonomous weapons systems, Mines Action Canada has released an updated memo to delegates on CCW Protocol IV which pre-emptively banned blinding laser weapons.
Please download the Updated Protocol IV Memo.
Mines Action Canada delivered an opening statement at the Convention on Conventional Weapons this afternoon. The text of the statement is available online here.
Opening Statement – Convention on Conventional Weapons 13 April 2015
Thank you, Mr. Chair. I appreciate the opportunity to speak on behalf of Mines Action Canada. Mines Action Canada is a Canadian disarmament organization that has been working to reduce the impact of indiscriminate weapons for over twenty years. For years we have worked with partners around the world, including here at the CCW, to respond to the global crisis caused by landmines and cluster munitions. We have seen that the international community can come together to respond to a humanitarian catastrophe and can create international humanitarian law to protect civilians, though often after the fact due to the changing nature of conflict and technological advances. Today, however, we are here in an attempt to look forward. We are looking at future weapons that will require new legal instruments to prevent future catastrophes. Throughout this week I hope we will keep my grandmother’s advice in mind: an ounce of prevention is worth a pound of cure.
As a co-founder of the Campaign to Stop Killer Robots, Mines Action Canada is very conscious of public opinion concerning autonomous weapons systems. Since last year’s discussions here at the CCW, opposition to autonomous weapons systems has grown in Canada. In addition to our member organizations, academics, parliamentarians, industry, faith communities and members of the general public have expressed concern about the potential humanitarian impacts of autonomous weapons systems. The widespread opposition to this technology indicates that there may be negative consequences for robotics more generally should autonomous weapons systems be used in armed conflict or in other circumstances. The erosion of public trust in robotic systems and autonomy as a result of the use of autonomous weapons systems could severely limit our ability to harness the good that robotics could do for humanity.
In addition to these concerns about the impact on public trust in robotics, we have numerous legal, moral, ethical, technical, military, political and humanitarian concerns about autonomous weapons systems, which have led us to the conclusion that new international humanitarian law is needed to ensure meaningful human control over these and other weapons. There is a moral imperative to consider the long-term effects of the development and deployment of autonomous weapons systems on human society. Proponents of these technologies cite possible battlefield benefits, and yet a discussion dealing only with short-term or battlefield effects is not enough. We must ask the difficult questions: is it acceptable to cede decisions over life and death in conflict to machines? Who would be accountable for autonomous weapons systems? How can IHL adapt when new technology blurs the line between combatant and weapon?
IHL has demonstrated an ability to adapt and evolve to prevent the development and deployment of new and unnecessarily harmful technology. CCW Protocol IV banning blinding laser weapons is a good example which demonstrates that not only is there a need to add to IHL to address new technology, but also that we can prevent the development and use of weapons before their unacceptable humanitarian consequences create a catastrophe. We have published a memo to delegates which further explores the lessons learned from Protocol IV.
Autonomous weapons systems are not your average new weapon; they have the potential to fundamentally alter the nature of conflict. As a “game-changer”, autonomous weapons deserve a serious and in-depth discussion. We hope that this week will see attempts to define meaningful human control and will foster a strong desire to pursue discussions towards a new legal instrument that places meaningful human control at the centre of all decisions to use violent force.