Delivered by Paul Hannon, Executive Director
Thank you Mr. Chairman. As a co-founder of the Campaign to Stop Killer Robots and a long-time advocate for humanitarian disarmament, Mines Action Canada supports the statement delivered by the Campaign’s Coordinator.
In many ways, 2017 was a lost year for efforts to prohibit autonomous weapons here, so we are hoping to see significant progress at the CCW in 2018.
Outside of these walls, though, the conversation about autonomous weapons progressed at the end of 2017 and the start of 2018.
In November, over 200 Canadian Artificial Intelligence experts released an open letter to Prime Minister Justin Trudeau calling for Canadian leadership on autonomous weapons systems. These Canadian experts are still waiting for a response from the government of Canada. Similar national letters have been released in Australia and Belgium.
Two weeks ago the G7 Innovation Ministers released a Statement on Artificial Intelligence which cited the need to increase trust in AI and included a commitment to “continue to encourage research, including […] examining ethical considerations of AI.”
This week should provide an opportunity for states to share and expand on their positions with regard to autonomous weapons systems and the need for meaningful human control. States should not overlook the ethical, humanitarian and human rights concerns about autonomous weapons systems as we delve into some technical topics.
Mr. President, CCW protocols have a history of addressing the ethical and humanitarian concerns about weapons. Protocol IV on blinding laser weapons is particularly relevant to our discussions. As a pre-emptive prohibition on an emerging technology motivated by ethical concerns, Protocol IV has been very effective in preventing the use of an abhorrent weapon without limiting the development of laser technology for other purposes, including other military purposes. It is important to note that Protocol IV has some of the widest membership of all the protocols, including all five permanent members of the United Nations Security Council, all the states that have chaired the autonomous weapons talks here at the CCW and most of the states that have expressed views about autonomous weapons. All those states are party to a Protocol that banned, for ethical reasons, a weapon before it was ever deployed in conflict.
Above all, we hope that the states present this week will reflect on the concept of responsibility. The Government of Poland’s working paper which discusses this topic is a useful starting point. We see responsibility as a theme that runs throughout these discussions.
A Canadian godfather of Artificial Intelligence has often spoken of the need to pursue responsible AI. Responsible AI makes life better for society and helps “prevent the misuse of AI applications that could cause harm” as noted in the G7 Annex.
We have been entrusted with a great responsibility here in this room. We have the responsibility to set boundaries and prevent future catastrophes. We must be bold in our actions or we could face a situation where computer programmers become de facto policy makers.
Above all, as part of our collective humanity, we must remain responsible for our actions – we cannot divest control to one of our creations whether it is in our daily actions, or more crucially for this week’s discussion, in our decisions to use weapons.
In the past, those sitting in these seats have met their responsibility to “continue the codification and progressive development of the rules of international law applicable in armed conflict” by negotiating new protocols and in the case of blinding laser weapons a pre-emptive protocol. Now it is our turn and this is our issue to address.
Mines Action Canada and the Campaign to Stop Killer Robots have been busy talking about autonomous weapons this winter.
MAC Executive Director Paul Hannon traveled to Halifax to speak at the Canadian International Council’s (CIC) local AGM. In his talk, he shared the game plan to stop killer robots, drawing on lessons from the Ottawa Treaty banning landmines. The CIC posted Paul’s blog post accompanying the lecture, which you can find online. The blog post states quite clearly that it’s decision time for Canada on autonomous weapons.
“The third revolution in warfare is coming fast. Unlike most revolutions we know this one is coming. What is even more unusual is that we can stop this revolution before it starts. Before anyone is injured or killed. It will take a lot of political will by many countries including Canada. Do we have the will and more importantly the courage to use it?”
Mary Wareham, the Campaign to Stop Killer Robots’ coordinator, spoke to the prestigious Munich Security Conference in February. A public event on artificial intelligence and modern conflict organized by the conference saw common views emerge from different perspectives against weapons that, once activated, could identify, select and attack targets without further human intervention. The event opened with remarks by a “robot” and featured a panel where Mary spoke alongside the president of Estonia, a general from Germany, and a former head of NATO. The recap of that event is available on the global campaign’s website.
One of the Campaign to Stop Killer Robots’ co-founders, Noel Sharkey of the International Committee for Robot Arms Control, will be speaking in Halifax on March 21. Noel will debate Duncan MacIntosh, Professor of Philosophy at Dalhousie University, on the role of autonomous weapons and the question of how concerned we should be. More details are available online.
On March 28, Erin Hunt, Program Coordinator, will join ThePANEL to discuss autonomous weapons and the campaign. The AI Arms Race: Should We Be Worried? brings together experts from Canada and the U.S. to debate the impact of AI on global politics and human rights. Tickets are available online.
Wherever we are talking to the public about autonomous weapons, one thing is clear: Canadians, like others around the world, are expecting their government to come up with a plan to prevent the development of autonomous weapons soon. In order to make that happen, MAC and the Campaign to Stop Killer Robots are working hard in preparation for the Group of Governmental Experts meeting in Geneva in April.
Delivered by: Erin Hunt, Programme Coordinator
Thank you Mr. Chair. As a co-founder of the Campaign to Stop Killer Robots, Mines Action Canada is very conscious of public opinion and the public conversation concerning autonomous weapons systems. Recently, autonomous weapon systems have been in the news in Canada. Last week, over 200 Canadian Artificial Intelligence experts released an open letter to Prime Minister Justin Trudeau calling for Canadian leadership on the issue. The letter states: “Lethal autonomous weapons systems that remove meaningful human control from determining the legitimacy of targets and deploying lethal force sit on the wrong side of a clear moral line.”
Copies of this letter can be found at the back of the room. It is not only in Canada where the AI research community is speaking out – a similar letter was also released in Australia. As mentioned by my colleague, since the last time the CCW met a letter from over 100 founders of robotics and artificial intelligence companies calling for a preemptive ban on autonomous weapons was also released. Additional national letters are in the works.
These public letters show that concerns about possible negative impacts of a pre-emptive ban are misplaced, as ICRAC made clear moments ago; what the research community is calling for is bold and decisive action.
Mines Action Canada appreciates the significant number of expert presentations we have had this week, but we hope that states will take time to share their views in substance over the remaining days.
From states that say an Article 36 review may be sufficient to address our concerns about autonomous weapons systems, we hope to hear how such a review would be able to assess bias in the data used in machine learning, and how compliance with IHL would be ensured by systems that continue to learn after the review.
In light of persistent statements from some delegations that they are uncertain about what we are talking about here, we hope to hear states share their current understanding of autonomous weapons systems. Specific definitions are not needed at this stage, but we believe there is more clarity and consensus on these questions than one might think.
We would like to hear more on next steps from states who are calling for a pre-emptive ban. Mines Action Canada would welcome concrete discussions on how to ensure that momentum is not lost on this issue. We lost a week of work in August but as I mentioned at the beginning of my statement, the public conversation about autonomous weapons continues to advance and the people at home expect us to make progress.
This week it is important to continue to build on the work done in the past and to ensure that further discussions take place in 2018. Administrative challenges do not lessen “the need to continue the codification and progressive development of the rules of international law applicable in armed conflict” that is reaffirmed in the Preamble of this Convention. The technology is rapidly advancing and so must our conversations here.
Mines Action Canada welcomes the letter calling for a ban on the weaponization of Artificial Intelligence (AI) from the Canadian AI research community which was sent to Prime Minister Justin Trudeau. This letter follows a number of international letters in recent years (from faith leaders, scientists, Nobel laureates, company founders and others) addressed either to the UN or the global community in support of actions to prevent the development of autonomous weapons.
“This letter is evidence that the Canadian AI community wants to see leadership from Canada,” said Paul Hannon, Executive Director, Mines Action Canada. “Clearly Canada should become the 20th country to call for a pre-emptive ban on autonomous weapons and to lead a process to ensure that autonomous weapons systems never arrive on the battlefield.”
More than 200 AI researchers in Canada signed the open letter to the Prime Minister “calling on you and your government to make Canada the 20th country in the world to take a firm global stand against weaponizing AI. Lethal autonomous weapons systems that remove meaningful human control from determining the legitimacy of targets and deploying lethal force sit on the wrong side of a clear moral line.”
The letter goes further asking “Canada to announce its support for the call to ban lethal autonomous weapons systems at the upcoming United Nations Conference on the Convention on Certain Conventional Weapons (CCW). Canada should also commit to working with other states to conclude a new international agreement that achieves this objective.” One of the letter’s authors, Dr. Ian Kerr of the University of Ottawa wrote an op-ed in the Globe and Mail bringing the letter’s message to Canadians from coast to coast to coast.
Dr. Kerr notes, “it is not often that captains of industry, scientists and technologists call for prohibitions on innovation of any sort, let alone an outright ban. But the Canadian AI research community is clear: We must not permit AI to target and kill without meaningful human control. Playing Russian roulette with the lives of others can never be justified. The decision on whether to ban autonomous weapons goes to the core of our humanity.”
This letter has been released one week before the international community meets under the auspices of the CCW to discuss the issue of autonomous weapons systems. Mines Action Canada’s Programme Coordinator, Erin Hunt will be attending the meeting next week in Geneva. She said “in past discussions at the CCW, some states have expressed concern that a prohibition on autonomous weapons systems would have a negative impact on AI research more broadly. This letter and the similar one released by Australian AI experts show that those concerns are misplaced. The AI research community is calling for the opposite – bold and decisive action to prohibit autonomous weapons systems in order to support the development of AI that would benefit humanity.”
Our popular Keep Killer Robots Fiction T-shirts have been re-launched and new items are available. For a limited time only you can get your Keep Killer Robots Fiction t-shirt in three styles as well as Keep Killer Robots Fiction mugs and tote bags. Visit www.teespring.com/keepkillerrobotsfiction2017 to purchase yours today.
The t-shirts, totes and mugs are only available until November 6th so order today!
Advocates of a ban on killer robots can learn from the new Treaty on the Prohibition of Nuclear Weapons
On July 7 2017, 122 states adopted a new treaty prohibiting nuclear weapons at the United Nations in New York. The Treaty on the Prohibition of Nuclear Weapons bans the development, possession, stockpiling, transfer, use and threat of use of nuclear weapons while also requiring states to assist victims of nuclear weapons use and testing as well as to remediate affected environments.
This treaty came about after years of work by states and civil society. In the face of disarmament efforts that had been stalled for decades, civil society and like-minded states reframed nuclear disarmament from an arms control question to a humanitarian issue. The humanitarian framing was a key driver in the Humanitarian Initiative that led to the negotiations just as it had been for the processes prohibiting anti-personnel landmines and cluster munitions.
Just as advocates for nuclear disarmament were able to learn from previous campaigns to ban landmines and cluster munitions, future disarmament campaigns can learn many lessons from this process leading up to the Treaty’s adoption. In particular, it can provide a number of important lessons for civil society and states working towards a pre-emptive ban on autonomous weapons systems.
First, the nuclear ban process demonstrated that it is much easier to prohibit weapons before they are used. It took 71 years, 11 months and 1 day to adopt a treaty prohibiting nuclear weapons after their first use in Hiroshima, Japan. In that time, nine other states developed nuclear weapons (South Africa later dismantled its program), nuclear weapons were tested over 1,000 times and the global nuclear arsenal ballooned to over 60,000 before dropping to the current total of approximately 15,000. We also cannot forget that the first United Nations resolution ever agreed to, back in 1946, was in support of nuclear disarmament. Over the past seven decades, countless hours and dollars have been spent trying to prohibit and eliminate nuclear weapons.
When it comes to autonomous weapons, a similar time frame could result in countless casualties and unimaginable humanitarian harm. Looking back now on what it has taken to get to this place regarding nuclear weapons, and how much further there is to go, the only logical conclusion is that we should have prohibited nuclear weapons in 1945 – it would have saved many lives and significant amounts of money.
Second, conversations about the humanitarian impact, legality and morality of weapons are useful even when others want to talk about “hard security” because they open space both for conversations and for other actors. Nuclear disarmament has often been characterized as the ultimate hard security topic – one that has to be talked about in “serious” state security and arms control terms. However, the process that led to the nuclear ban treaty framed the conversation around the humanitarian impact of nuclear weapons.
By placing human security at the centre of these discussions, the humanitarian initiative created new space to consider the prohibition of nuclear weapons. All states had a stake in the negotiations because of the trans-border nature of the humanitarian consequences of nuclear weapons. Whether they were part of a Nuclear Weapons Free Zone or neighbor to a nuclear armed state, all states who participated saw that nuclear weapons were a threat to their citizens. Prioritizing the security of citizens changed the conversation and a similar shift could be expected within the discussion of autonomous weapons systems.
The morality of nuclear weapons also figured heavily in the Humanitarian Initiative. Faith groups especially highlighted the morally unacceptable basis of nuclear deterrence. The moral and ethical issues surrounding autonomous weapons systems have been a topic of conversation at the national and international level from the start and this should continue. As humanity we cannot overlook questions about the morality and ethics of weapons. It is often the moral and ethical arguments that motivate states to disavow weapons that may have some perceived military utility.
Third, being rational and realistic is not exclusive to those focusing on traditional state security. Disarmament advocates and states supporting the nuclear ban treaty had realistic, security-focused reasons for pursuing nuclear disarmament and the treaty. Nonetheless, critics of civil society and of the ban treaty process often dismissed them as driven by emotions and values rather than rational thought, while those supporting the status quo were considered “rational” and “realistic”. Beyond the gendered narrative constructed by these assertions, the important thing to remember is that even those who focus on hard or state security are motivated by values. The values may be different, but that does not mean they are any less emotionally driven.
Fourth, treaties that construct norms can have an impact. The Treaty on the Prohibition of Nuclear Weapons has been called a constructivist treaty because it seeks to further construct the international norm against nuclear weapons. Treaties are only legally binding on the states that join them; however, it is evident that the normative or constructivist aspects of the treaty will have a wide-ranging impact. The reactions to the nuclear ban treaty by the nuclear weapon states, especially the United Kingdom, France and the United States, show that they expect the norms set by the Treaty to affect the global perception of nuclear weapons and their self-perceived status as legitimate possessors of nuclear weapons.
Fifth, diplomats and governments should listen to scientists. After the development of the first nuclear weapons, scientists from a variety of disciplines appealed to world leaders to “remember your humanity, and forget the rest” and end the threat of use of nuclear weapons. That appeal went unheard and decades later, over 3,000 scientists signed an open letter in support of the 2017 negotiations. The scientists should have been listened to in the 1940s. Now, we see thousands of AI experts, roboticists, computer scientists and others speaking out and calling for a pre-emptive ban on autonomous weapons systems. Many of these experts do not want to see their work corrupted into harming people. Their concerns are serious and should be treated as such.
Sixth, colonial views still permeate international affairs. For decades the only opinions that seemed to matter about nuclear weapons were those of the five nuclear weapon states (the US, the UK, France, Russia and China) and occasionally their allies. The security concerns of the nuclear armed states took precedence over any concerns of the vast majority of the planet. Even after the Humanitarian Initiative began to level the playing field – or, as Costa Rica put it, “democracy has come to nuclear disarmament” – the nuclear armed states continued to question the validity of other states’ concerns. During informal comments in the United Nations General Assembly’s First Committee in 2016, the United Kingdom’s ambassador stated that the states pressing for a nuclear ban treaty did not have security concerns, unlike the states that opposed the negotiations. It goes without saying that all states have security concerns.
These colonialist hangovers also had an impact on who were considered acceptable guardians of weapons capable of destroying humanity. Built into the conversations about nuclear weapons is the idea that they are bad for most states to have, but that a select few states can have them without it being a problem. However, as the Humanitarian Initiative has shown there are no safe or responsible hands for nuclear weapons. The risk of another nuclear detonation, intentional or not, is far too high.
Colonialist views can be seen in the assumption that only “we” will have autonomous weapon systems and the reasons “we” might need them are more valid than reasons for opposing them. In the nuclear ban context, these views were overcome by a humanitarian approach and by meetings which were open to all states and blockable by none. Unlike a number of other United Nations bodies that deal with nuclear disarmament, the 2016 Open Ended Working Group and the 2017 negotiating conference were open to all states as well as international organizations and civil society. These meetings originated in the United Nations General Assembly which provided rules of procedure predicated on equality between all participating states. International efforts to pre-emptively prohibit autonomous weapons systems should aim for similar levels of openness and inclusivity to ensure that all states have a voice on this issue that will affect us all.
These lessons from the nuclear ban treaty process reinforce lessons from the Ottawa Treaty and from the Convention on Cluster Munitions. The Ottawa Treaty showed that the international community should not wait until there is a humanitarian crisis with huge stockpiles and thousands of casualties, and the Treaty on the Prohibition of Nuclear Weapons puts that lesson into practice. Other lessons from the Ottawa Treaty, such as the fact that action is possible even when large states are not participating and the importance of including survivors in the process, were applied successfully in the nuclear ban process. Learning from and building on previous humanitarian disarmament processes will make future efforts much more likely to succeed.
The Humanitarian Initiative and the Treaty on the Prohibition of Nuclear Weapons have provided a number of lessons for campaigners and states seeking to pre-emptively prohibit autonomous weapons systems. More importantly perhaps this process has shown it is possible to create new international law even when some states are strongly opposed to the idea. The power of civil society and like-minded states to create change should not be underestimated.
The Campaign to Stop Killer Robots is deeply disappointed that the Convention on Conventional Weapons (CCW) has cancelled a crucial week of formal discussions on fully autonomous weapons in August. This step was taken because of the failure of several states, most notably Brazil, to pay their assessed dues for the convention’s meetings.
“The collective failure of countries to find a solution to their financial woes doesn’t mean they can stop addressing concerns over weapons that would select and attack targets without further human intervention” said Mary Wareham of Human Rights Watch, coordinator of the Campaign to Stop Killer Robots. “If the CCW is unable to act, nations must find other ways to maintain the momentum toward a ban,” she said. “Countries that agree with the need to retain human control of weapons systems should move swiftly to adopt national policies and laws and to negotiate a new international treaty prohibiting fully autonomous weapons.”
The call for a preemptive ban on fully autonomous weapons has been endorsed by 19 countries and dozens more states have affirmed the need to retain human control over the selection of targets and use of force. This clearly indicates that they see a need to prevent the development of fully autonomous weapons. Last December, China became the first permanent member of the UN Security Council to find that new international law is required to regulate fully autonomous weapons.
The Campaign calls on Canada and all countries to urgently address the enormous humanitarian challenges posed by these weapons by endorsing the call for a ban. It is vital and urgent that all stakeholders work together to secure a new international treaty before these weapons are unleashed.
“Canada has a long history of taking action when the CCW is unable to move forward,” said Paul Hannon, Executive Director of Mines Action Canada, a co-founder of the Campaign to Stop Killer Robots. “We are calling on Canada to act now to ensure that there is always meaningful human control over weapons. The international community cannot let the work done thus far go to waste.”
The Campaign to Stop Killer Robots fundamentally objects to permitting machines to take a human life on the battlefield or in policing, border control, and other circumstances. It calls for a preemptive ban on fully autonomous weapons through new international law as well as through domestic legislation.
Following the launch of the Campaign to Stop Killer Robots and a debate in the Human Rights Council, countries agreed in November 2013 to begin discussing what they called lethal autonomous weapons systems at the Convention on Conventional Weapons at the United Nations in Geneva. The CCW is a framework treaty that prohibits or restricts certain weapons and its 1995 protocol on blinding lasers is an example of a weapon being preemptively banned before it was acquired or used.
Most of the CCW’s 124 high contracting parties participated in three meetings on lethal autonomous weapons systems in 2014-2016, in addition to UN agencies, the International Committee of the Red Cross, and the Campaign to Stop Killer Robots. Last December at their Fifth Review Conference CCW states decided to formalize and expand those deliberations by establishing a Group of Governmental Experts on lethal autonomous weapons systems to meet in August and November 2017, chaired by Ambassador Amandeep Singh Gill of India.
However, on 30 May, the CCW’s president-designate Ambassador Matthew Rowland of the UK announced that the Group of Governmental Experts meeting scheduled for 21-25 August had been cancelled due to a lack of funds. Previously, Rowland had issued several warnings that the lack of payment of assessed financial contributions would mean the likely cancellation of CCW meetings planned for 2017.
Several countries have financial arrears from previous years, but according to the UN’s official summary, Brazil accounts for 86 percent of the outstanding contributions due to four core humanitarian disarmament treaties, including the CCW. Brazil last paid its assessed CCW contributions in 2010. The Campaign to Stop Killer Robots has appealed to Brazil to pay its outstanding contributions without delay and it challenges CCW states to achieve cost saving measures in other ways that do not require the cancellation of key meetings.
Several autonomous weapons systems with various degrees of human control are currently in use by high-tech militaries including CCW states the US, China, Israel, South Korea, Russia, and the UK. The concern is that low-cost sensors and advances in artificial intelligence are making it increasingly practical to design weapons systems that would target and attack without any meaningful human control. If the trend towards autonomy continues, humans will start to fade out of the decision-making loop, first retaining only a limited oversight role, and then no role at all.
Canada, France, the UK, and the US supported establishing the CCW Group of Governmental Experts last December, but remain unambitious in their overall goals for the process, proposing a focus on sharing best practices and achieving greater transparency in the conduct of legal reviews of new weapons systems. Russia openly opposed the creation of a Group of Governmental Experts, but did not block multilateral consensus for establishing one.
New poll indicates that 55% of Canadians oppose autonomous weapons systems
This week, Ipsos released results from the first global public opinion survey that included a question on autonomous weapons. Autonomous weapons, sometimes called killer robots, are future weapons that could select and fire upon a target without human control. Ipsos found that 55% of Canadians surveyed opposed autonomous weapons while another 25% were uncertain about the technology.
In the survey, 11,500 citizens across 25 countries were asked: “The United Nations is reviewing the strategic, legal and moral implications of autonomous weapons systems. These systems are capable of independently selecting targets and attacking those targets without human intervention; they are thus different than current day ‘drones’ where humans select and attack targets. How do you feel about the use of autonomous weapons in war?” In all but five countries (France, India, the US, China and Poland), a clear majority are opposed to the use of autonomous weapons in war.
Among Canadians, 21% of respondents reported being somewhat opposed to autonomous weapons in war while 34% were strongly opposed to the technology being used in war. Only 5% of Canadians surveyed were strongly supportive of using autonomous weapons in war. This survey is the first to poll Canadians on autonomous weapons systems.
“As part of the Campaign to Stop Killer Robots, we have frequently heard from Canadians that they want to ensure that there is meaningful human control over weapons at all times. This survey confirms that those opinions represent the majority of Canadians,” said Paul Hannon, Executive Director of Mines Action Canada, a co-founder of the Campaign to Stop Killer Robots. “In addition to strong citizen opposition to the use of autonomous weapons in war, Canada also has the first robotics company in the world to vow to never build autonomous weapons, Clearpath Robotics. It is time for the Canadian government to catch up to the citizens and come up with a national policy on autonomous weapons.”
Mines Action Canada is calling on the Government of Canada to ensure meaningful public and Parliamentary involvement in drafting Canada’s national position on autonomous weapons systems prior to the United Nations talks on the subject later this year.
– 30 –
Media Contact: Erin Hunt, Program Coordinator, Mines Action Canada, + 1 613 241-3777 (office), + 1 613 302-3088 (mobile) or firstname.lastname@example.org.
Countries surveyed: Argentina, Belgium, Mexico, Poland, Russia, Saudi Arabia, South Africa, South Korea, Sweden, Turkey, Hungary, Australia, Brazil, Canada, China, France, Germany, Great Britain, India, Italy, Japan, Spain, Peru and the United States of America.
Originally published on the Forum on the Arms Trade’s Looking Ahead blog, Erin Hunt looks at opportunities and challenges ahead in 2017 for efforts to preemptively ban autonomous weapons systems.
2017 has the potential to be a pivotal year in efforts to ensure that all weapons have meaningful human control. For three years, the Convention on Conventional Weapons (CCW) has been discussing lethal autonomous weapons (future weapons that could select and fire upon a target without human control). In December 2016, the Review Conference of the CCW decided to establish a Group of Governmental Experts (GGE) chaired by Ambassador Amandeep Singh Gill of India which will meet over 10 days in 2017 and then report-back to the CCW’s annual meeting on 22-24 November.
A GGE is a more formal level of meetings than the ones held in 2014, 2015 and 2016. States will be expected to bring their own experts and participate actively in discussions, instead of listening to presentations by outside experts and asking questions of those experts. The first meeting of the GGE will be held at the UN in Geneva on either 24-28 April or 21-25 August 2017. The date is dependent on when funds are available for the meeting. The second meeting of the GGE will be on 13-17 November, just before the annual CCW meeting.
In 2016, the number of states calling for a pre-emptive ban on fully autonomous weapons more than doubled. At the time of writing, Algeria, Argentina, Bolivia, Chile, Costa Rica, Cuba, Ecuador, Egypt, Ghana, Guatemala, Holy See, Mexico, Nicaragua, Pakistan, Panama, Peru, State of Palestine, Venezuela and Zimbabwe have called for a ban while a number of other states seem to support new international humanitarian law of some sort to deal with autonomous weapons systems.
This GGE is a large step towards a pre-emptive ban on autonomous weapons systems, but there are a number of challenges ahead in 2017. First, the Russian Federation continues to object to more formal talks on autonomous weapons systems on the grounds that it is premature to move forward while there is no clear understanding of the subject under discussion. That objection overlooks the fact that definitions are usually the last part of disarmament treaties to be negotiated. It was only at the very end of the 2016 CCW Review Conference that Russia agreed not to block the GGE.
Second, the majority of states, including my own, Canada, do not have national policies on autonomous weapons systems. However, this challenge is also an opportunity. The Campaign to Stop Killer Robots will be working hard around the world in 2017 to support the development of national policies on autonomous weapons systems. After three years of informal CCW experts meetings as well as discussions in the Human Rights Council, states have a large amount of information at their disposal to begin to craft national policies. States can also hold consultations on creating a national policy in advance of the GGE meetings.
Third, there is the possibility that the GGE may become distracted by the inclusion of a discussion item on best practices and greater transparency in Article 36 weapons reviews. These legal reviews are an obligation of states developing, purchasing or otherwise acquiring new weapons.
Although Article 36 weapons reviews should be a topic of discussion at the international level to strengthen both policy and practice around the world, better weapons reviews will not solve the problems associated with autonomous weapons systems and should not distract the GGE from the core of its work. Weapons reviews cannot answer moral, ethical, and political questions. An Article 36 review cannot tell us whether it is acceptable to the public conscience for a machine to kill without meaningful human control. Autonomous weapons systems are often referred to as a revolution in warfare, and as such, moral, ethical and political considerations must not be pushed aside. These questions need to remain on the international agenda in 2017.
This year, we will witness significant work done at the national and international level to increase understanding of the challenges posed by autonomous weapons as well as the number of states calling for a pre-emptive ban. Stay tuned to see if the international community stands ready at year’s end to ensure that all weapons have meaningful human control.
The 5th Review Conference of the Convention on Conventional Weapons (CCW) wrapped up today in Geneva and we’re very pleased that states agreed to hold two weeks of formal meetings in 2017 to discuss autonomous weapons. This Group of Governmental Experts (GGE) is the next step towards new international law about autonomous weapons. The international Campaign to Stop Killer Robots has a comment on the GGE decision online.
It’s been a busy week at the CCW: Mines Action Canada delivered a statement in the General Debate, and then we worked with our campaign colleagues to shore up support for the GGE.
To make sure you didn’t miss out on any of the week’s events, we’ve created daily recaps in both Storify and video format. This week marks the start of a whole new phase of our efforts to ban killer robots. Donate today to support our work.