Statement delivered by Paul Hannon, Mines Action Canada, at the Group of Governmental Experts Meeting in March 2019
Thank you Chairperson
On Monday a delegation mentioned the Convention on Cluster Munitions. Many experiences with that life-saving treaty are relevant to this week’s discussions, not the least of which is the importance of leaving definitions until you are at the negotiating table.
For many years the refrain from a small number of states here in the CCW was that existing IHL was sufficient to cover cluster munitions, but the large number of states who held a different view found a new pathway to address their concerns. For those who are not yet a State Party to the Convention on Cluster Munitions, the treaty was negotiated in 2008 and bans an indiscriminate, inhumane and unreliable weapon system that causes unacceptable harm to civilians. It entered into force and became a new part of International Humanitarian Law on August 1st 2010. To date 120 states have joined the CCM, and the treaty has impacts even in countries that have not yet joined: for example, weapons manufacturers and financial institutions have ceased to produce or invest in cluster munitions due to the reputational risk and the effects of such risk on their companies’ bottom lines.
Last week, the latest government to do so announced the destruction of its entire stockpile of cluster munitions, joining a long list of States Parties that have destroyed their stockpiles, including many of the states that spoke this week or in previous GGE sessions on weapons reviews.
Presumably, many of those states that had acquired cluster munitions did Article 36 weapons reviews, using the existing IHL of the time. I say presumably because information on these reviews is not publicly available.
Yet having undertaken their weapons reviews, acquired the weapon, and in some cases used it, those states ultimately agreed that the existing IHL of the time needed a new instrument specific to that weapon, and they joined the new cluster munitions treaty.
IHL is not static. Since the CCW negotiated Protocol IV, a preventive treaty for a weapon that had never been deployed, five other additions have been made to IHL, illustrating that it does evolve and, through new legally binding instruments, becomes even stronger.
Mines Action Canada believes all states should undertake national weapons reviews before acquiring any weapon. We also believe current processes can be more robust and transparent. Public trust in the viability and acceptability of weapons is very important. Mines Action Canada would be pleased to assist any international efforts to improve these reviews, but that is not the work of this GGE.
Compared to autonomous weapons systems, cluster munitions are technologically a much more straightforward weapon. It is hard, therefore, to reconcile the fact that existing IHL was not sufficient for cluster munitions and required a new addition to IHL with the claim that it is somehow sufficient for a new weapon system built on emerging, unproven, untested, and unpredictable technologies.
Our experience with cluster munitions and other weapons leads to the inevitable conclusion that, from a humanitarian perspective, Article 36 weapons reviews on their own are insufficient to protect civilian populations from autonomous weapons systems.
New international law is needed to address the multitude of concerns with autonomous weapon systems. We believe this is possible by 2020 and urge High Contracting Parties to agree to a negotiating mandate in November.
Chairperson, the ICRC is well known for reminding us that even wars have limits. Mines Action Canada believes that the same applies to autonomy in weapons: even with autonomy there are limits. It is time to negotiate another legally binding instrument, either here or elsewhere, for three key reasons: firstly, to protect civilians; secondly, to ensure that research and development of the beneficial uses of these new technologies continues and is not tainted by the stigmatizing impact of fully autonomous weapons; and finally, to come to a common agreement on how retaining meaningful human control will help define those limits to autonomy.
The Campaign to Stop Killer Robots has hired their first full time staff person. Isabelle Jones is the Campaign to Stop Killer Robot’s Project Officer based in Ottawa with Mines Action Canada. As she gets settled into her new role, we sat down to chat.
MAC: You have an educational and work background in human rights and global development. When (and why) did you become interested in disarmament issues?
IJ: In my fourth year of undergraduate studies in Global Development, I moved towards focusing my studies on the intersection of development and conflict – how development happens or stalls in complex contexts, fragile regions, and in the post-conflict period. In one of my classes we watched a clip from a documentary on the impact that landmines have had – and continue to have – in Cambodia. I was already a little familiar with landmines and landmine action after participating in a Red Cross presentation on the topic, but watching that documentary it seemed to click that these weapons weren’t just devastating at the point of detonation, but could continue to impact the development of communities, regions, and even countries long after conflict ends.
After class, some quick Internet searching led me to the International Campaign to Ban Landmines (ICBL), and then the Mines Action Canada (MAC) website. Learning about MAC’s work and their youth internship program, I decided that the 8-month internship would be the perfect post-graduation work opportunity. I could take the year to learn more about humanitarian disarmament and the long process of recovery that follows conflict, and then apply to grad school. Unfortunately, timing was not on my side. The start date for the program shifted and I wouldn’t complete my program in time to be eligible, but that interest in disarmament work never went away.
MAC: And your interest in weapons technology?
IJ: I started thinking more and more about weapons technology. How has military technology, and the militarization of technology evolved since the laws of war were codified? How does this impact the lives and rights of civilians? And what does it say about how society views and values the human cost of war? I applied for my Master’s program with a proposal to research the use of drone technology in international and non-international armed conflicts, and the implications of this technology for international human rights and international humanitarian law. Over the course of my research my focus shifted slightly, and ultimately my dissertation argued that drone technology is deployed within a modern, bureaucratized system of labour – an institutional structure that can condition, shape and blind people to partake in morally, ethically and legally questionable acts of violence.
MAC: How did you learn about the Campaign to Stop Killer Robots?
IJ: Several members of the Campaign to Stop Killer Robots, like Article 36, HRW, ICRAC and PAX, have written and published research on armed drones, so I came across them in my dissertation research. This led me to learn about the work of the campaign, which I continued to follow throughout my studies and after their completion. I saw the proliferation of armed drones as a precursor to the lethal autonomous weapons systems that the campaign works to prohibit, and agreed with the campaign’s stance that it is essential to maintain human control over combat weapons. I have followed the work of the campaign closely and am honoured to be joining such a dedicated, passionate team of campaigners!
MAC: You will be working out of the Mines Action Canada office. What do you know about MAC’s work in humanitarian disarmament?
IJ: For decades MAC has been a leader in the global disarmament community, playing key roles in the International Campaign to Ban Landmines, Cluster Munition Coalition and (of course) the Campaign to Stop Killer Robots. Working nationally and internationally, MAC seeks to eliminate the consequences of indiscriminate weapons – weapons that cannot differentiate between civilians and combatants, and legitimate or illegitimate targets. This is the work that first sparked my interest and passion in humanitarian disarmament. After first hoping to become a MAC intern all those years ago, I am thrilled to now be working out of the Mines Action Canada office.
MAC: What are you most looking forward to in your new job?
IJ: I am most looking forward to the variety of the work. There is something very exciting about working in an environment where every day is a little different and there are always new challenges and opportunities to learn landing on your desk – which I think is part of the nature of working on a campaign that is small on staff, big on goals!
MAC: What do you like doing in your spare time?
IJ: In my spare time I love getting outdoors − camping, hiking, canoeing, scuba diving – and exploring new places through travel. Next on my travel bucket list is hiking in the Patagonia region of Chile. I am also an avid reader and you can often find me curled up on the couch with a new book, or re-reading one of my favourites for the umpteenth time.
Delivered by Paul Hannon, Executive Director
Thank you Mr. Chairman. As a co-founder of the Campaign to Stop Killer Robots and a long-time advocate for humanitarian disarmament, Mines Action Canada supports the statement delivered by the Campaign’s Coordinator.
In many ways 2017 was a lost year for efforts to prohibit autonomous weapons here, so we are hoping to see significant progress at the CCW in 2018.
Outside of these walls though, the conversation about autonomous weapons progressed at the end of 2017 and the start of 2018.
In November, over 200 Canadian Artificial Intelligence experts released an open letter to Prime Minister Justin Trudeau calling for Canadian leadership on autonomous weapons systems. These Canadian experts are still waiting for a response from the government of Canada. Similar national letters have been released in Australia and Belgium.
Two weeks ago the G7 Innovation Ministers released a Statement on Artificial Intelligence which cited the need to increase trust in AI and included a commitment to “continue to encourage research, including […] examining ethical considerations of AI.”
This week should provide an opportunity for states to share and expand on their positions with regard to autonomous weapons systems and the need for meaningful human control. States should not overlook the ethical, humanitarian and human rights concerns about autonomous weapons systems as we delve into some technical topics.
Mr. President, CCW protocols have a history of addressing the ethical and humanitarian concerns about weapons. Protocol IV on blinding laser weapons is particularly relevant to our discussions. As a pre-emptive prohibition on an emerging technology motivated by ethical concerns, Protocol IV has been very effective in preventing the use of an abhorrent weapon without limiting the development of laser technology for other purposes, including other military purposes. It is important to note that Protocol IV has some of the widest membership of all the protocols, including all five permanent members of the United Nations Security Council, all the states that have chaired the autonomous weapons talks here at the CCW, and most of the states that have expressed views about autonomous weapons. All those states are party to a Protocol that banned, for ethical reasons, a weapon before it was ever deployed in conflict.
Above all, we hope that the states present this week will reflect on the concept of responsibility. The Government of Poland’s working paper which discusses this topic is a useful starting point. We see responsibility as a theme that runs throughout these discussions.
A Canadian godfather of Artificial Intelligence has often spoken of the need to pursue responsible AI. Responsible AI makes life better for society and helps “prevent the misuse of AI applications that could cause harm” as noted in the G7 Annex.
We have been entrusted with a great responsibility here in this room. We have the responsibility to set boundaries and prevent future catastrophes. We must be bold in our actions or we could face a situation where computer programmers become de facto policy makers.
Above all, as part of our collective humanity, we must remain responsible for our actions – we cannot divest control to one of our creations whether it is in our daily actions, or more crucially for this week’s discussion, in our decisions to use weapons.
In the past, those sitting in these seats have met their responsibility to “continue the codification and progressive development of the rules of international law applicable in armed conflict” by negotiating new protocols and in the case of blinding laser weapons a pre-emptive protocol. Now it is our turn and this is our issue to address.
Delivered by: Erin Hunt, Programme Coordinator
Thank you Mr. Chair. As a co-founder of the Campaign to Stop Killer Robots, Mines Action Canada is very conscious of public opinion and the public conversation concerning autonomous weapons systems. Recently, autonomous weapon systems have been in the news in Canada. Last week, over 200 Canadian Artificial Intelligence experts released an open letter to Prime Minister Justin Trudeau calling for Canadian leadership on the issue. The letter states [quote] “Lethal autonomous weapons systems that remove meaningful human control from determining the legitimacy of targets and deploying lethal force sit on the wrong side of a clear moral line.” [unquote]
Copies of this letter can be found at the back of the room. It is not only in Canada where the AI research community is speaking out – a similar letter was also released in Australia. As mentioned by my colleague, since the last time the CCW met a letter from over 100 founders of robotics and artificial intelligence companies calling for a preemptive ban on autonomous weapons was also released. Additional national letters are in the works.
These public letters show that concerns about the possible negative impacts of a pre-emptive ban are misplaced, as ICRAC made clear moments ago, and that what the research community is calling for is bold and decisive action.
Mines Action Canada appreciates the significant number of expert presentations we have had this week, but we hope that states will take time to share their views substantively over the remaining days.
From states who say an Article 36 review may be sufficient to deal with our concerns about autonomous weapons systems, we hope to hear how such a review would be able to assess bias in the data used in machine learning, and how compliance with IHL would be ensured for systems that continue to learn after the review.
In light of persistent statements from some delegations that they are uncertain about what we are talking about here, we hope to hear states share their current understanding of autonomous weapons systems. Specific definitions are not needed at this stage, but we believe there is more clarity and consensus on these questions than one might think.
We would like to hear more on next steps from states who are calling for a pre-emptive ban. Mines Action Canada would welcome concrete discussions on how to ensure that momentum is not lost on this issue. We lost a week of work in August but as I mentioned at the beginning of my statement, the public conversation about autonomous weapons continues to advance and the people at home expect us to make progress.
This week it is important to continue to build on the work done in the past and to ensure that further discussions take place in 2018. Administrative challenges do not lessen “the need to continue the codification and progressive development of the rules of international law applicable in armed conflict” that is reaffirmed in the Preamble of this Convention. The technology is rapidly advancing and so must our conversations here.
The Campaign to Stop Killer Robots is deeply disappointed that the Convention on Conventional Weapons (CCW) has cancelled a crucial week of formal discussions on fully autonomous weapons in August. This step was taken because of the failure of several states, most notably Brazil, to pay their assessed dues for the convention’s meetings.
“The collective failure of countries to find a solution to their financial woes doesn’t mean they can stop addressing concerns over weapons that would select and attack targets without further human intervention” said Mary Wareham of Human Rights Watch, coordinator of the Campaign to Stop Killer Robots. “If the CCW is unable to act, nations must find other ways to maintain the momentum toward a ban,” she said. “Countries that agree with the need to retain human control of weapons systems should move swiftly to adopt national policies and laws and to negotiate a new international treaty prohibiting fully autonomous weapons.”
The call for a preemptive ban on fully autonomous weapons has been endorsed by 19 countries and dozens more states have affirmed the need to retain human control over the selection of targets and use of force. This clearly indicates that they see a need to prevent the development of fully autonomous weapons. Last December, China became the first permanent member of the UN Security Council to find that new international law is required to regulate fully autonomous weapons.
The Campaign calls on Canada and all countries to urgently address the enormous humanitarian challenges posed by these weapons by endorsing the call for a ban. It is vital and urgent that all stakeholders work together to secure a new international treaty before these weapons are unleashed.
“Canada has a long history of taking action when the CCW is unable to move forward,” said Paul Hannon, Executive Director of Mines Action Canada, a co-founder of the Campaign to Stop Killer Robots. “We are calling on Canada to act now to ensure that there is always meaningful human control over weapons. The international community cannot let the work done thus far go to waste.”
The Campaign to Stop Killer Robots fundamentally objects to permitting machines to take a human life on the battlefield or in policing, border control, and other circumstances. It calls for a preemptive ban on fully autonomous weapons through new international law as well as through domestic legislation.
Following the launch of the Campaign to Stop Killer Robots and a debate in the Human Rights Council, countries agreed in November 2013 to begin discussing what they called lethal autonomous weapons systems at the Convention on Conventional Weapons at the United Nations in Geneva. The CCW is a framework treaty that prohibits or restricts certain weapons and its 1995 protocol on blinding lasers is an example of a weapon being preemptively banned before it was acquired or used.
Most of the CCW’s 124 high contracting parties participated in three meetings on lethal autonomous weapons systems in 2014-2016, in addition to UN agencies, the International Committee of the Red Cross, and the Campaign to Stop Killer Robots. Last December at their Fifth Review Conference CCW states decided to formalize and expand those deliberations by establishing a Group of Governmental Experts on lethal autonomous weapons systems to meet in August and November 2017, chaired by Ambassador Amandeep Singh Gill of India.
However, on 30 May, the CCW’s president-designate, Ambassador Matthew Rowland of the UK, announced that the Group of Governmental Experts meeting scheduled for 21-25 August had been cancelled due to a lack of funds. Rowland had previously issued several warnings that the lack of payment of assessed financial contributions would likely mean the cancellation of CCW meetings planned for 2017.
Several countries have financial arrears from previous years, but according to the UN’s official summary, Brazil accounts for 86 percent of the outstanding contributions owed to four core humanitarian disarmament treaties, including the CCW. Brazil last paid its assessed CCW contributions in 2010. The Campaign to Stop Killer Robots has appealed to Brazil to pay its outstanding contributions without delay, and it challenges CCW states to find cost-saving measures that do not require the cancellation of key meetings.
Several autonomous weapons systems with various degrees of human control are currently in use by high-tech militaries including CCW states the US, China, Israel, South Korea, Russia, and the UK. The concern is that low-cost sensors and advances in artificial intelligence are making it increasingly practical to design weapons systems that would target and attack without any meaningful human control. If the trend towards autonomy continues, humans will start to fade out of the decision-making loop, first retaining only a limited oversight role, and then no role at all.
Canada, France, UK, and the US supported establishing the CCW Group of Governmental Experts last December, but remain unambitious in their overall goals for the process by proposing a focus on sharing best practices and achieving greater transparency in the conduct of legal reviews of new weapons systems. Russia openly opposed the creation of a Group of Governmental Experts, but did not block multilateral consensus for establishing one.
Originally published on the Forum on the Arms Trade’s Looking Ahead blog, Erin Hunt looks at opportunities and challenges ahead in 2017 for efforts to preemptively ban autonomous weapons systems.
2017 has the potential to be a pivotal year in efforts to ensure that all weapons have meaningful human control. For three years, the Convention on Conventional Weapons (CCW) has been discussing lethal autonomous weapons (future weapons that could select and fire upon a target without human control). In December 2016, the Review Conference of the CCW decided to establish a Group of Governmental Experts (GGE) chaired by Ambassador Amandeep Singh Gill of India, which will meet over 10 days in 2017 and then report back to the CCW’s annual meeting on 22-24 November.
A GGE is a more formal level of meetings than the ones held in 2014, 2015 and 2016. States will be expected to bring their own experts and participate actively in discussions, instead of listening to presentations by outside experts and asking questions of those experts. The first meeting of the GGE will be held at the UN in Geneva on either 24-28 April or 21-25 August 2017. The date is dependent on when funds are available for the meeting. The second meeting of the GGE will be on 13-17 November, just before the annual CCW meeting.
In 2016, the number of states calling for a pre-emptive ban on fully autonomous weapons more than doubled. At the time of writing, Algeria, Argentina, Bolivia, Chile, Costa Rica, Cuba, Ecuador, Egypt, Ghana, Guatemala, Holy See, Mexico, Nicaragua, Pakistan, Panama, Peru, State of Palestine, Venezuela and Zimbabwe have called for a ban while a number of other states seem to support new international humanitarian law of some sort to deal with autonomous weapons systems.
This GGE is a large step towards a pre-emptive ban on autonomous weapons systems but there are a number of challenges ahead in 2017. First, the Russian Federation continues to object to more formal talks on autonomous weapon systems on the grounds that it is premature to move forward since there is not a clear understanding of the subject under discussion. That objection forgets that definitions are usually the last part of disarmament treaties to be negotiated. It was only at the very end of the 2016 CCW Review Conference that Russia agreed to not block the GGE.
Second, the majority of states, including my own, Canada, do not have national policies on autonomous weapons systems. However, this challenge is also an opportunity. The Campaign to Stop Killer Robots will be working hard around the world in 2017 to support the development of national policies on autonomous weapons systems. After three years of informal CCW experts meetings as well as discussions in the Human Rights Council, states have a large amount of information at their disposal to begin to craft national policies. States can also hold consultations on creating a national policy in advance of the GGE meetings.
Third, there is the possibility that the GGE may become distracted by the inclusion of a discussion item on best practices and greater transparency in Article 36 weapons reviews. These legal reviews are an obligation of states developing, purchasing or otherwise acquiring new weapons.
Although Article 36 weapons reviews should be a topic of discussion at the international level to strengthen both policy and practice around the world, better weapons reviews will not solve the problems associated with autonomous weapons systems and should not distract the GGE from the core of its work. Weapons reviews cannot answer moral, ethical, and political questions. An Article 36 review cannot tell us if it is acceptable to the public conscience for a machine to kill without meaningful human control. Autonomous weapons systems are often referred to as a revolution in warfare and, as such, moral, ethical and political considerations must not be pushed aside. These questions need to remain on the international agenda in 2017.
This year, we will witness significant work done at the national and international level to increase understanding of the challenges posed by autonomous weapons as well as the number of states calling for a pre-emptive ban. Stay tuned to see if the international community stands ready at year’s end to ensure that all weapons have meaningful human control.
The Convention on Conventional Weapons (CCW) Review Conference in December will decide whether to hold a Group of Governmental Experts (GGE) meeting on autonomous weapons systems in 2017. A GGE is the logical next step in the work to address concerns about autonomous weapons systems (or killer robots).
The Campaign to Stop Killer Robots is getting ready for the Review Conference here in Canada and around the world. Check out our colleagues at Reaching Critical Will for an update on the Preparatory Meeting of the CCW to see how the international preparations are going.
On the Canadian side, our Program Coordinator, Erin Hunt, was pleased to deliver the Campaign’s statement to the United Nations General Assembly’s First Committee on October 12.
Over the next month and a bit, we will be talking with parliamentarians, civil society and academics to help ensure that Canada takes a strong position at the Review Conference and beyond. You can help by writing your MP to ask that Canada outline a national policy on autonomous weapons or by donating online to support our work.
Today Mines Action Canada’s Program Coordinator made an intervention during CCW discussions about autonomous weapons systems and weapons review processes.
Thank you Madame Chair. I would like to take this opportunity to share Mines Action Canada’s observations about Article 36 reviews.
Like many others, Mines Action Canada was concerned to learn at last year’s experts meeting that there was so little transparency around Article 36 weapons reviews. The fact that so few states were willing to discuss their weapons review process is a significant impediment to the prevention of humanitarian harm caused by new weapons. Indeed, it seems that too few states actually undertake these reviews in a comprehensive manner.
Last year’s revelations concerning Article 36 reviews have made it clear that international discussions on the topic are necessary. Today is a start. States need to be more transparent in their weapons review processes. Sharing criteria and standards or setting international standards will do much to shed light on the shadowy world of arms procurement. Mines Action Canada believes that Article 36 weapons reviews should be a topic of discussion at the international level to strengthen both policy and practice around the world.
However, better weapons reviews will not solve the problems associated with autonomous weapons systems for a number of reasons.
First, there is the issue of timing. A successful international process to increase the effectiveness of weapons reviews will require a significant amount of time – time we do not have in the effort to prevent the use of autonomous weapons systems because technology is developing too rapidly.
Second, weapons reviews were designed for a very different type of weapon than autonomous weapon systems which have been called the third revolution in warfare. Autonomous weapons systems will blur the line between weapon and soldier to a level that may be beyond the ability of a weapons review process. In addition, the systemic complexity that will be required to operate such a weapons system is a far cry from the more linear processes found in current weapons.
Third, Article 36 reviews are not obligated to cover weapons used for domestic purposes outside of armed conflict, such as policing, border control, or crowd control. Mines Action Canada, along with many civil society organizations and states present here, has serious concerns about the possible use of autonomous weapons systems in law enforcement and, more generally, in uses outside of armed conflict.
Fourth and most importantly, weapons reviews cannot answer the moral questions surrounding delegating the kill decision to a machine. An Article 36 review cannot tell us if it is acceptable for an algorithm to kill without meaningful human control. And that is one of the key questions we are grappling with here this week.
Article 36 weapons reviews are a legal obligation for most of the states here. It is time for a separate effort to strengthen the standards and transparency around weapons reviews. That effort must neither distract from nor overtake our work here to deal with the real moral, legal, ethical and security problems associated with autonomous weapons systems. Weapons reviews must be supplemented by new and robust international law that clearly and deliberately puts meaningful human control at the centre of all new weapons development.
The concerns raised by autonomous weapons are urgent and must take priority. In fact, a GGE next year on autonomous weapons will greatly assist future work on weapons reviews by highlighting the many challenges new technologies pose for such reviews.
Overall, there is a need for international work to improve Article 36 reviews but there is little evidence to back up the claims of some states that weapons review processes would be sufficient to ensure that autonomous weapons systems are acceptable. Article 36 reviews are only useful once questions of the moral and ethical acceptability of a weapon have been dealt with. Until that time, it would be premature to view weapons review as a panacea to our issues here at CCW.
Our Executive Director, Paul Hannon delivered an opening statement at the CCW meeting on autonomous weapons systems today.
Thank you, Chairperson.
I appreciate the opportunity to speak on behalf of Mines Action Canada. Mines Action Canada is a Canadian disarmament organization that has been working to reduce the humanitarian impact of indiscriminate weapons for over twenty years. During this time, we have worked with partners around the world, including here at the CCW, to respond to the global crisis caused by landmines, cluster munitions, and other indiscriminate weapons. What makes this issue different is that we have an opportunity to act now, before a weapon causes a humanitarian catastrophe.
As a co-founder of the Campaign to Stop Killer Robots, Mines Action Canada’s concern with the development of autonomous weapons systems runs across the board. We have numerous legal, moral/ethical, technical, operational, political, and humanitarian concerns about autonomous weapons systems. The question of the acceptability of delegating death is not an abstract thought experiment, but the fundamental question, with policy, legal and technological implications for the real world. We must all keep this question at the fore whenever discussing autonomous weapons systems: do you want to live in a world where algorithms or machines can make the decision to take a life? War is a human activity, and removing the human component from war is dangerous for everybody. We strongly support the position of the Campaign to Stop Killer Robots that permitting machines to take a human life on the battlefield or in policing, border or crowd control, and other circumstances is unacceptable.
We have watched the development of the discourse surrounding autonomous weapons systems since the beginning of the campaign. 2015 saw a dramatic expansion of the debate into different forums and segments of our global community, and that expansion, along with the support it has generated, has continued into 2016. Be it at artificial intelligence conferences, the World Economic Forum, the Halifax Security Forum or in the media, the call for a pre-emptive ban is reaching new audiences. The momentum towards a pre-emptive ban on autonomous weapons systems is clearly growing.
Mines Action Canada recognizes that there are considerable challenges facing the international community in navigating legal issues concerning an emerging technology. The desire not to hinder research and development into potentially beneficial technologies is understandable, but a pre-emptive ban on autonomous weapons systems will not limit beneficial research. As a senior executive from a robotics company told us at a workshop on autonomous weapons last week, there are no other applications for an autonomous system which can make a “kill or not kill” decision. The function that gives an autonomous weapon the ability to make the “kill decision” and implement it has no equivalent civilian use. A pre-emptive ban would have no impact on the funding of research and development in artificial intelligence or robotics.
On the other hand, there are numerous other applications that would benefit society by improving other aspects of robotic weapons while maintaining meaningful human control over the decision to cause harm. Communications technology, encryption, virtual reality, sensor technology – all have much broader and more beneficial applications, from search and rescue by first responders to watching a school play when you can’t be there in person. None of that research and development would be hindered by a pre-emptive ban on autonomous weapons systems. A pre-emptive ban would, however, allow governments, the private sector and academics to direct investments towards technologies that can have as much future benefit for non-military uses as possible.
While the “kill decision” function is only necessary for one application of robotic technology, predictability is an important requirement for all robots regardless of the context in which they are used. Manufacturing robots work well because they operate in a predictable space. Driverless cars will also operate in a predictable space, though one much less predictable than a factory, which is one of the reasons they require so much more testing and time to develop. Robotic weapons will be required to work in the least predictable of spaces, that is, in combat, and are therefore much more prone to failure. Commanders, on the other hand, need weapons they can rely on. Civilians need, and have a right to expect, that every effort is taken to protect them from the harmful effects of conflict.
Mines Action Canada appreciates the significant number of expert presentations scheduled for this week, but we hope that states will take time to share their views throughout the week. It is time for states to begin to talk about their concerns, their positions and their policies. For this reason, we are calling on the High Contracting Parties to take the next step later this year at the Review Conference and establish a Group of Governmental Experts with a mandate to negotiate a new protocol on autonomous weapons.
We note that in the last 20 years three new legal instruments have entered into force. Each bans a weapon system, and each was covered by the general rules of International Humanitarian Law at the time, but the international community felt that new, specific laws banning these weapons were warranted. This not only strengthened the protection of civilians, but also made IHL more robust.
Autonomous weapons systems are not your average new weapon; they have the potential to fundamentally alter the nature of conflict. As a “game-changer”, autonomous weapons systems deserve a serious and in-depth discussion. That discussion should also happen at the national level. Mines Action Canada hopes that our country will begin that effort this spring through the recently announced defence review, and that other states will follow suit with their own national discussions.
At the core of this work is a desire to protect civilians and limit the humanitarian harm caused by armed conflict. We urge states not to lose sight of the end goal and their motivations as they complete the difficult work necessary for a robust and effective pre-emptive ban.
We’re almost a month into 2016 and autonomous weapons systems have already been in the news, thanks to a strong panel discussion at the World Economic Forum in Davos. The Campaign to Stop Killer Robots was pleased to see the panel agree that the world needs a diplomatic process to pre-emptively ban autonomous weapons systems, and that it should start soon. You can read the whole analysis by the Campaign’s coordinator here.
Yes, 2016 is starting on a high note for the campaign, but this is not the time to be complacent. We need to keep that momentum going internationally and here in Canada. The new government has yet to share a national policy on autonomous weapons systems. Before the election, the Liberal Party of Canada wrote that:
“Emerging technologies such as Lethal Autonomous Weapon Systems pose new and serious ethical questions that must be studied and understood. The Liberal Party of Canada will work with experts and civil society to ensure that the Canadian Government develops appropriate policies to address the use and proliferation of autonomous weapon systems.”
Now that the Liberals form the government, they will have to develop “appropriate policies” soon, because the international community is moving forward, albeit verrrrrry slowly. States are meeting in April 2016 for a third (and hopefully final) informal experts meeting on autonomous weapons systems under the United Nations’ Convention on Conventional Weapons, and then at the end of the year states will have the opportunity to start negotiations on a pre-emptive ban. The UN process has been called “glacial” and said to show “no sense of urgency”, but there’s still time for states to pick up the pace, and Canada can take a leadership role.
Canadian industry, academics and NGOs have already taken a leadership role on banning autonomous weapons systems, so now it’s the government’s turn. The Canadian government and Prime Minister Trudeau made a big impression at the World Economic Forum, so we hope that they will bring that energy forward to act on one of the newest issues discussed there. Let’s make 2016 a year of action on autonomous weapons systems.