On August 29, 2018, Mines Action Canada delivered the following statement at the Convention on Conventional Weapons Group of Governmental Experts on Autonomous Weapons Systems
Thank you Mr. Chair. As a co-founder of the Campaign to Stop Killer Robots, Mines Action Canada urges states to start negotiating a new treaty to ban fully autonomous weapons and to retain meaningful human control over the use of force.
As an organization from Canada, which has put a focus on Artificial Intelligence as a driver of our future economy, we see the prohibition of autonomous weapons systems as safeguarding public trust in AI and robotics.
This year’s Canadian trust survey by Proof – a polling and public relations firm – found that only 39 per cent of Canadians trust that artificial intelligence will contribute positively to the Canadian economy, and even fewer women believe this to be true, at 36 per cent. It also found that only 25 per cent of Canadians trust AI companies to do what is best for Canada and Canadians. These levels of public trust will present a problem for the commercial success of AI in the future.
Public trust in the technology is absolutely crucial to its transition from a cool novelty to an integral part of our lives. If the technology is weaponized, it will be much harder for it to become a useful part of our lives.
At yesterday’s side event, the panelist from the Future of Life Institute clearly outlined how scientists and relevant subject matter experts are concerned about the risks that autonomous weapons systems pose to the reputation of new technology, and that they are not worried about any risks a ban would pose to dual-use technology. Protocol IV of the CCW and the Chemical Weapons Convention have shown us that weapons can be prohibited without risking the development of beneficial technology.
I will conclude with a few words about the concept of risk. We have spent over five years discussing autonomous weapons systems here at the CCW, and throughout these talks experts from around the world have outlined the risks posed by autonomous weapons systems – technological risks, humanitarian risks, security risks, moral risks and political risks. It is very clear that there are significant risks in developing autonomous weapons systems.
As we heard at a side event in April, the financial community makes its decisions based on risk: if an investment is too risky, you don’t make it, even if the potential for a big payoff is there. We are constantly surprised that, after hearing so many expert assessments that this technology poses high risks to civilian populations, some states still object to developing new IHL to prohibit autonomous weapons systems on the grounds that there may be tangential benefits from the technology.
It’s one thing to risk money; it’s another thing entirely to risk other people’s lives.
High contracting parties should make the responsible choice in the face of overwhelming risk and start negotiating a new treaty to ban fully autonomous weapons.
Mines Action Canada, Project Ploughshares, the Women’s International League for Peace and Freedom and the International Committee for Robot Arms Control, together with the Government of Canada, are hosting a briefing event during the Convention on Certain Conventional Weapons Group of Governmental Experts Meeting in Geneva. The lunchtime event will be held on Thursday, August 30; please see the flyer for more details.
Our popular Keep Killer Robots Fiction T-shirts have been re-launched and new items are available. For a limited time only you can get your Keep Killer Robots Fiction t-shirt in three styles as well as Keep Killer Robots Fiction mugs and tote bags. Visit www.teespring.com/keepkillerrobotsfiction2017 to purchase yours today.
The t-shirts, totes and mugs are only available until November 6th so order today!
Advocates of a ban on killer robots can learn from the new Treaty on the Prohibition of Nuclear Weapons
On July 7, 2017, 122 states adopted a new treaty prohibiting nuclear weapons at the United Nations in New York. The Treaty on the Prohibition of Nuclear Weapons bans the development, possession, stockpiling, transfer, use and threat of use of nuclear weapons, while also requiring states to assist victims of nuclear weapons use and testing and to remediate affected environments.
This treaty came about after years of work by states and civil society. In the face of disarmament efforts that had been stalled for decades, civil society and like-minded states reframed nuclear disarmament from an arms control question to a humanitarian issue. The humanitarian framing was a key driver in the Humanitarian Initiative that led to the negotiations just as it had been for the processes prohibiting anti-personnel landmines and cluster munitions.
Just as advocates for nuclear disarmament were able to learn from previous campaigns to ban landmines and cluster munitions, future disarmament campaigns can learn many lessons from this process leading up to the Treaty’s adoption. In particular, it can provide a number of important lessons for civil society and states working towards a pre-emptive ban on autonomous weapons systems.
First, the nuclear ban process demonstrated that it is much easier to prohibit weapons before they are used. It took 71 years, 11 months and 1 day to adopt a treaty prohibiting nuclear weapons after their first use in Hiroshima, Japan. In that time, nine other states developed nuclear weapons (South Africa later dismantled its program), nuclear weapons were tested over 1,000 times and the global nuclear arsenal ballooned to over 60,000 before dropping to the current total of approximately 15,000. We also cannot forget that the first United Nations resolution ever agreed to, back in 1946, was in support of nuclear disarmament. Over the past seven decades, countless hours and dollars have been spent trying to prohibit and eliminate nuclear weapons.
When it comes to autonomous weapons, a similar time frame could result in countless casualties and unimaginable humanitarian harm. Looking back now on what it has taken to get to this place regarding nuclear weapons and how much further there is to go, the only logical conclusion is that we should have prohibited nuclear weapons in 1945 – it would have saved many lives and significant amounts of money.
Second, conversations about the humanitarian impact, legality and morality of weapons are useful even when others want to talk about “hard security”, because they open space both for conversations and for other actors. Nuclear disarmament has often been characterized as the ultimate hard security topic – one that has to be talked about in “serious” state security and arms control terms. However, the process that led to the nuclear ban treaty framed the conversation around the humanitarian impact of nuclear weapons.
By placing human security at the centre of these discussions, the humanitarian initiative created new space to consider the prohibition of nuclear weapons. All states had a stake in the negotiations because of the trans-border nature of the humanitarian consequences of nuclear weapons. Whether they were part of a Nuclear Weapons Free Zone or a neighbour to a nuclear-armed state, all participating states saw that nuclear weapons were a threat to their citizens. Prioritizing the security of citizens changed the conversation, and a similar shift could be expected within the discussion of autonomous weapons systems.
The morality of nuclear weapons also figured heavily in the Humanitarian Initiative. Faith groups especially highlighted the morally unacceptable basis of nuclear deterrence. The moral and ethical issues surrounding autonomous weapons systems have been a topic of conversation at the national and international level from the start and this should continue. As humanity we cannot overlook questions about the morality and ethics of weapons. It is often the moral and ethical arguments that motivate states to disavow weapons that may have some perceived military utility.
Third, being rational and realistic is not exclusive to those focusing on traditional state security. Disarmament advocates and states supporting the nuclear ban treaty had realistic, security-focused reasons for pursuing nuclear disarmament and the treaty; nevertheless, critics often accused them of being driven by emotions and values rather than rational thought, while those supporting the status quo were considered “rational” and “realistic”. Beyond the gendered narrative constructed by these assertions, the important thing to remember is that even those who focus on hard or state security are motivated by values. The values may be different, but that does not make them any less emotionally driven.
Fourth, treaties that construct norms can have an impact. The Treaty on the Prohibition of Nuclear Weapons has been called a constructivist treaty because it seeks to further construct the international norm against nuclear weapons. Treaties are only legally binding on the states that join them; however, it is evident that the normative or constructivist aspects of the treaty will have a wide-ranging impact. The reactions of the nuclear weapon states, especially the United Kingdom, France and the United States, show that they expect the norms set by the Treaty to affect the global perception of nuclear weapons and their self-perceived status as legitimate possessors of nuclear weapons.
Fifth, diplomats and governments should listen to scientists. After the development of the first nuclear weapons, scientists from a variety of disciplines appealed to world leaders to “remember your humanity, and forget the rest” and end the threat of use of nuclear weapons. That appeal went unheard and decades later, over 3,000 scientists signed an open letter in support of the 2017 negotiations. The scientists should have been listened to in the 1940s. Now, we see thousands of AI experts, roboticists, computer scientists and others speaking out and calling for a pre-emptive ban on autonomous weapons systems. Many of these experts do not want to see their work corrupted into harming people. Their concerns are serious and should be treated as such.
Sixth, colonial views still permeate international affairs. For decades the only opinions that seemed to matter about nuclear weapons were those of the five nuclear weapon states (the US, the UK, France, Russia and China) and occasionally their allies. The security concerns of the nuclear-armed states took precedence over those of the vast majority of the planet. Even after the Humanitarian Initiative began to level the playing field – or, as Costa Rica put it, “democracy has come to nuclear disarmament” – the nuclear-armed states still continued to question the validity of other states’ concerns. During informal comments in the United Nations General Assembly’s First Committee in 2016, the United Kingdom’s ambassador stated that the states pressing for a nuclear ban treaty, unlike the states opposing the negotiations, did not have security concerns. It goes without saying that all states have security concerns.
These colonialist hangovers also had an impact on who were considered acceptable guardians of weapons capable of destroying humanity. Built into the conversations about nuclear weapons is the idea that they are bad for most states to have, but that a select few states can have them without it being a problem. However, as the Humanitarian Initiative has shown there are no safe or responsible hands for nuclear weapons. The risk of another nuclear detonation, intentional or not, is far too high.
Colonialist views can be seen in the assumption that only “we” will have autonomous weapon systems and the reasons “we” might need them are more valid than reasons for opposing them. In the nuclear ban context, these views were overcome by a humanitarian approach and by meetings which were open to all states and blockable by none. Unlike a number of other United Nations bodies that deal with nuclear disarmament, the 2016 Open Ended Working Group and the 2017 negotiating conference were open to all states as well as international organizations and civil society. These meetings originated in the United Nations General Assembly which provided rules of procedure predicated on equality between all participating states. International efforts to pre-emptively prohibit autonomous weapons systems should aim for similar levels of openness and inclusivity to ensure that all states have a voice on this issue that will affect us all.
These lessons from the nuclear ban treaty process reinforce lessons from the Ottawa Treaty and from the Convention on Cluster Munitions. The Ottawa Treaty showed that the international community should not wait until there is a humanitarian crisis with huge stockpiles and thousands of casualties, and the Treaty on the Prohibition of Nuclear Weapons puts that lesson into practice. Other lessons from the Ottawa Treaty, such as the fact that action is possible even when large states are not participating and the importance of including survivors in the process, were applied successfully in the nuclear ban process. Learning from and building on previous humanitarian disarmament processes will make future efforts much more likely to succeed.
The Humanitarian Initiative and the Treaty on the Prohibition of Nuclear Weapons have provided a number of lessons for campaigners and states seeking to pre-emptively prohibit autonomous weapons systems. Perhaps more importantly, this process has shown that it is possible to create new international law even when some states are strongly opposed to the idea. The power of civil society and like-minded states to create change should not be underestimated.
The 5th Review Conference of the Convention on Conventional Weapons (CCW) wrapped up today in Geneva and we’re very pleased that states agreed to hold two weeks of formal meetings in 2017 to discuss autonomous weapons. This Group of Governmental Experts (GGE) is the next step towards new international law about autonomous weapons. The international Campaign to Stop Killer Robots has a comment on the GGE decision online.
It’s been a busy week at the CCW: Mines Action Canada delivered a statement in the General Debate, and then we worked with our campaign colleagues to shore up support for the GGE.
So that you don’t miss out on any of the week’s events, we’ve created daily recaps in both Storify and video format. This week marks the start of a whole new phase of our efforts to ban killer robots. Donate today to support our work.
Thank you Chair. I appreciate the opportunity to speak on behalf of Mines Action Canada.
Although today we are starting the 5th Review Conference of the Convention on Conventional Weapons, we must spend our time looking forward. We are entrusted with preventing humanitarian harm from existing weapons like incendiary weapons and from future weapons that will require new legal instruments to avoid catastrophes to come.
The CCW has spent three years holding informal meetings about autonomous weapons systems. At times during those discussions, we have felt that some have underestimated the skills, knowledge, intelligence, training, experience, humanity and morality that women and men in uniform combine with situational awareness and IHL to make decisions during conflict. We work closely with roboticists and engineers, but despite their expertise and the high quality of their work, we do not believe an algorithm could replicate this complex and very human decision-making process. Robotics should only be used to inform and supplement human decision making.
In the CCW’s work on autonomous weapons systems, we have learned more about Article 36 reviews, but it is clear that states need to be more transparent, systematic and rigorous in their weapons review processes. Mines Action Canada believes that Article 36 weapons reviews should be a topic of discussion at the international level to strengthen both policy and practice around the world.
However, better weapons reviews will not solve the problems associated with autonomous weapons systems. For example, Article 36 reviews are not required to cover weapons used for domestic purposes outside of armed conflict, such as policing, border control, or crowd control. Most importantly, weapons reviews cannot answer moral, ethical, technical and political questions. An Article 36 review cannot tell us whether it is acceptable to the public conscience for a machine to kill without meaningful human control.
It is time for a separate effort to strengthen the standards and transparency around weapons reviews. That effort must neither distract from nor overtake our work here to deal with the real moral, legal, ethical and security problems associated with autonomous weapons systems. Weapons reviews must be grounded in new and robust international law that clearly and deliberately puts meaningful human control at the centre of all weapons development.
The concerns raised by autonomous weapons are urgent and must take priority. If we wait until everyone has a clear understanding of every aspect of the issue to start a Group of Governmental Experts, the window of opportunity to prevent humanitarian harm from autonomous weapons will close. A GGE will allow high contracting parties to develop their understanding of the issue and to pursue effective outcomes.
In Canada particularly, this year’s defence review offered an opportunity for the government to hear from a number of experts on autonomous weapons systems. A GGE next year would give Canada the opportunity to share the results of that process and to contribute to our collective understanding of the issue.
Mines Action Canada, as a co-founder of the Campaign to Stop Killer Robots, believes that the way forward must lead to a pre-emptive ban on autonomous weapons systems as a tool to prevent humanitarian harm without damaging research and development on autonomy and robotics for military or civilian purposes. Earlier this year, a Canadian robotics expert made it clear that there are no other applications for an autonomous system that can make a “kill or not kill” decision. The function giving an autonomous weapon the ability to make the “kill decision” has no equivalent civilian use; therefore, a pre-emptive ban on autonomous weapons systems would have no impact on the funding of research and development for artificial intelligence.
As experts at the meeting in April made clear, our window of opportunity to prevent future humanitarian harm from autonomous weapons will not stay open long, so we need to be moving forward at this Review Conference. Therefore, we urge states to accept the recommendation for an open-ended Group of Governmental Experts next year.
For the third year, the Convention on Conventional Weapons (CCW) met in Geneva to work on autonomous weapon systems. From April 11th to 15th, 2016, the informal experts meeting held at the United Nations addressed a wide range of issues and concerns. The meeting was attended by 94 member and observer states to the 1980 Convention on Conventional Weapons. This marked the highest turnout yet, reflecting increasing awareness and growing concern about these future weapons.
The CCW is a framework treaty that prohibits or restricts certain conventional weapons deemed to be excessively injurious or to have indiscriminate effects. The meeting included members of UN agencies, including the UN Institute for Disarmament Research, and the International Committee of the Red Cross (ICRC). The Campaign to Stop Killer Robots, with a diverse and passionate group of 40 campaigners from 10 countries, was active and provided substantive input and expert analysis in discussions on issues such as meaningful human control and the weaknesses of Article 36 weapon reviews, among other topics.
As a co-founder of the Campaign to Stop Killer Robots, Mines Action Canada (MAC) supports a pre-emptive ban on lethal autonomous weapons systems before they are ever developed or used. MAC was well represented at the meetings and gave statements about the repercussions of allowing such weapons to be developed. Executive Director Paul Hannon’s statement talked about the need to implement a ban on these systems before a humanitarian catastrophe occurs as well as the growing support for a pre-emptive ban. Project Coordinator Erin Hunt’s statement detailed the limitations of weapon reviews for lethal autonomous weapons systems and the importance of prioritizing a ban on these systems. Project Ploughshares, a new Canadian member of the Campaign to Stop Killer Robots, also participated actively.
Encouragingly, five nations (Algeria, Chile, Costa Rica, Mexico, and Nicaragua) called for a pre-emptive ban on lethal autonomous weapons systems, bringing to 14 the total number of states that now support this goal. Throughout the meeting, Cuba, Ecuador, the Holy See and Pakistan, as well as all NGO speakers, reiterated the need for a ban on lethal autonomous weapons systems.
Although many nations, including Canada, were not willing at this time to support a ban on lethal autonomous weapons systems, the importance of meaningful human control was underlined many times during the 5-day meeting. The Netherlands announced new policy in support of meaningful human control over deployed weapon systems. As well, Austria made note of the recommendation by two UN Special Rapporteurs in their February 2016 report to prohibit lethal autonomous weapons systems that require no meaningful human control. The report can be found here with reference to lethal autonomous weapons systems on page 17.
The common area of concern regarding lethal autonomous weapons systems is their inability to properly target and apply force in a strike. Without human control over the key points of target identification and strike approval, innocent civilians could be inappropriately targeted and killed. It is important to maintain meaningful human control over the critical functions of weapons systems to ensure proper application of international humanitarian law, as well as accountability should a target later be found to have been inappropriate. These safeguards cannot be guaranteed with lethal autonomous weapons, which thus threaten international security and stability.
The proliferation of lethal autonomous weapons systems risks the development of an arms race and a weakening of global security. Prohibitions and tight restrictions have helped to calm arms races and promote international peace. As well, lethal autonomous weapons systems threaten the protection of civilians in conflict zones. Autonomous robotic systems are most successful in predictable and reliable environments. However, autonomous weapons systems would be in highly unpredictable conflict zones and thus risk performing unreliably and endangering civilians.
NGOs remain concerned over the lack of concrete action taken on this issue. Strong and substantive statements were made by Mines Action Canada, the Nobel Women’s Initiative, Human Rights Watch and others at the CCW in April, urging states to establish an open-ended Group of Governmental Experts (GGE). A GGE would study the issues surrounding lethal autonomous weapons in depth and submit its findings to the UN. A GGE would be the first step in understanding how lethal autonomous weapons systems would comply with international humanitarian law, their effects on regional and global security and stability, and the risk of an arms race. In addition, the GGE would give the CCW time to explore the blurring of the line between soldier and weapon that fully autonomous weapon systems present. A short video compiled by the Campaign to Stop Killer Robots at the meeting in April can be seen here, and daily updates can be found online for Monday, Tuesday, Wednesday, Thursday and Friday.
The meeting ended with some recommendations for the future and states ‘may decide to establish’ a GGE at the 2016 Fifth Review Conference of the High Contracting Parties to the Convention on Prohibition or Restrictions on the Use of Certain Conventional Weapons in December. In addition, the recommendations state that the GGE should work to identify characteristics of lethal autonomous weapons systems and create a working definition.
Although most interventions supported the formation of a GGE as the next step in this process, the final decision will not be made until the Review Conference in December. While Mines Action Canada is cautiously optimistic that a Group of Governmental Experts will be formalized at the meeting this December, we remain concerned that the pace of these discussions is not keeping up with the speed of technological change. Of course, the specific mandate of the GGE will be all-important.
Much work remains over the coming months.
The Campaign to Stop Killer Robots released its report on the activities undertaken at the 3rd meeting of the Convention on Conventional Weapons, which can be found here.
A student post by our summer student Miranda Barclay.
With the third and hopefully final Convention on Conventional Weapons (CCW) informal experts meeting coming up in a couple of days, it is important to remind ourselves of what was discussed last year and what work still needs to be done.
The gathering of the CCW member states and organisations in Geneva in April 2015 was designed as a forum at which states could discuss the important technical, legal, moral and ethical issues surrounding autonomous weapons, otherwise known as ‘killer robots’.
At the 2015 meetings, almost all states that spoke agreed that further work is necessary and desirable, and many expressed that no autonomous weapons should be allowed to operate without meaningful human control, nor with human control that is ‘devoid of meaning’. There were, however, a small number of states that were more reserved regarding the eventual achievement of a pre-emptive ban on autonomous weapons. The US and Israel implied that they plan to leave the door open for the future acquisition of these weapons, while France and the UK stated that they would not pursue killer robots but still did not indicate support for the logical conclusion of a pre-emptive ban.
Another important notion that arose from the 2015 CCW meetings was that autonomous weapons, or killer robots, are not an inevitable piece of weaponry and should never be allowed to become one. This notion was a useful counterpoint to some interventions that seemed to underestimate the value and importance of human soldiers.
Further, the CCW focused heavily on norm creation, with members emphasising the need to establish norms in order to efficiently discuss and articulate what is most disturbing and threatening about the possibility of autonomous weapons use. Once these norms are clearly established and accepted by a majority of states, there will hopefully be a more concerted effort to transform them into fully ratified international law.
Finally, multiple countries and organisations identified the need to define what exactly some of the key terms commonly used at the conference meant. For example, what exactly is meant by ‘meaningful human control’? Further explorations of this principle could be a key component of a Group of Governmental Experts in 2017 leading to a process to prevent the use of fully autonomous weapons through law.
Hopefully, this year some more solid definitions can be agreed upon and a Group of Governmental Experts will be called for next year, so that the process of banning autonomous weapons through international law can be accelerated toward a pre-emptive ban.
Claudia Pearson is an undergraduate student at the University of Leeds, currently studying abroad at the University of Ottawa.
Executive Director Paul Hannon delivered our closing statement at the Convention on Conventional Weapons today. Download the statement here or read it below.
The Way Forward
Thank you Mr. Chair and your team for the strong foundation to move forward with the urgency and focus this issue requires. This week we have seen wide-ranging discussions on autonomous weapons systems. The CCW does not deal often enough with issues of morality, human rights and ethics. We welcome all states that have asserted the necessity of maintaining meaningful human control over the use of force. These conversations should continue and deepen.
There is one issue we would like to raise as food for thought. At times during the week, we have felt that some have underestimated the skills, knowledge, intelligence, training, experience, humanity and morality that men and women in uniform combine with situational awareness and IHL to make decisions during conflict. We work closely with roboticists, engineers, and technical experts, and despite their expertise and the high quality of their work, we do not believe an algorithm could replicate this complex decision-making process. Robotics should only be used to inform and supplement human decision making. To go further than that risks “dehumanizing those we expose to harm”, as RCW’s CCW Report editorial stated yesterday.
Allow me to conclude with the assertion that the international response to the possibility of autonomous weapons systems must not be limited to transparency alone. The expert presentations and the debates this week have strengthened our belief that autonomous weapons systems are not a typical new weapon and our current IHL and weapons review processes will not be sufficient. A mandate for a group of governmental experts next year is an appropriate and obvious next step. We look forward to working with the high contracting parties to ensure that meaningful human control remains at the centre of all decisions to use violent force.
Today at the Convention on Conventional Weapons meeting about lethal autonomous weapons systems, Mines Action Canada released a new memo to delegates on the impact of autonomous weapons systems on public trust in robotics. In this memo we discuss how the creation and use of autonomous weapons systems could change public perception of robotics more generally. Read the memo here and let us know what you think!
Will the use of killer robots make you more or less likely to want other autonomous robots in your life?