Statement delivered by Paul Hannon, Mines Action Canada, at the Group of Governmental Experts Meeting in March 2019
Thank you, Chairperson.
On Monday a delegation mentioned the Convention on Cluster Munitions. Many experiences with that life-saving treaty are relevant to this week’s discussions, not least the importance of leaving definitions until you are at the negotiating table.
For many years the refrain from a small number of states here in the CCW was that existing IHL was sufficient to cover cluster munitions, but the large number of states who held a different view found a new pathway to address their concerns. For those who are not yet a State Party to the Convention on Cluster Munitions: the treaty was negotiated in 2008 and bans an indiscriminate, inhumane and unreliable weapon system that causes unacceptable harm to civilians. It entered into force and became a new part of International Humanitarian Law on August 1st, 2010. To date 120 states have joined the CCM, and the treaty has impacts even in countries that have not yet joined; for example, weapons manufacturers and financial institutions have ceased to produce or invest in cluster munitions due to the reputational risk and the effects of such risk on their companies’ bottom lines.
Last week yet another government announced the destruction of its entire stockpile of cluster munitions, joining a long list of States Parties that have destroyed their stockpiles, including many of the states that spoke this week or in previous GGE sessions on weapons reviews.
Presumably, many of those states that had acquired cluster munitions conducted Article 36 weapons reviews using the IHL existing at the time. I say presumably because information on these reviews is not publicly available.
Yet having undertaken their weapons reviews, acquired the weapon, and in some cases used it, those states ultimately agreed that the IHL existing at that time needed a new instrument specific to that weapon, and they then joined the new cluster munitions treaty.
IHL is not static. Since the CCW negotiated Protocol IV, a preventive treaty for a weapon that had never been deployed, five other additions have been made to IHL, illustrating that it does evolve and, through new legally binding instruments, becomes even stronger.
Mines Action Canada believes all states should undertake national weapons reviews before acquiring any weapon. We also believe current processes can be more robust and transparent. Public trust in the viability and acceptability of weapons is very important. Mines Action Canada would be pleased to assist any international efforts to improve these reviews, but that is not the work of this GGE.
Compared to autonomous weapons systems, cluster munitions are technologically a much more straightforward weapon. It is hard, therefore, to reconcile the fact that existing IHL was not sufficient for cluster munitions and required a new addition to IHL, yet is somehow sufficient for a new weapon system built on emerging, unproven, untested, and unpredictable technologies.
Our experience with cluster munitions and other weapons leads to the inevitable conclusion that, from a humanitarian perspective, Article 36 weapons reviews on their own are insufficient to protect civilian populations from autonomous weapons systems.
New international law is needed to address the multitude of concerns with autonomous weapon systems. We believe this is possible by 2020 and urge High Contracting Parties to agree to a negotiating mandate in November.
Chairperson, the ICRC is well known for reminding us that even wars have limits. Mines Action Canada believes that the same applies to autonomy in weapons: even with autonomy there are limits. It is time to negotiate another legally binding instrument, either here or elsewhere, for three key reasons: firstly, to protect civilians; secondly, to ensure that research and development of the beneficial uses of these new technologies continues and is not tainted by the stigmatizing impact of fully autonomous weapons; and finally, to come to a common agreement on how retaining meaningful human control will help define those limits to autonomy.
Delivered by Paul Hannon, Executive Director
Thank you, Mr. Chairman. As a co-founder of the Campaign to Stop Killer Robots and a long-time advocate for humanitarian disarmament, Mines Action Canada supports the statement delivered by the Campaign’s Coordinator.
In many ways 2017 was a lost year for efforts to prohibit autonomous weapons here so we are hoping to see significant progress at the CCW in 2018.
Outside of these walls though, the conversation about autonomous weapons progressed at the end of 2017 and the start of 2018.
In November, over 200 Canadian Artificial Intelligence experts released an open letter to Prime Minister Justin Trudeau calling for Canadian leadership on autonomous weapons systems. These Canadian experts are still waiting for a response from the government of Canada. Similar national letters have been released in Australia and Belgium.
Two weeks ago the G7 Innovation Ministers released a Statement on Artificial Intelligence which cited the need to increase trust in AI and included a commitment to “continue to encourage research, including […] examining ethical considerations of AI.”
This week should provide an opportunity for states to share and expand on their positions with regard to autonomous weapons systems and the need for meaningful human control. States should not overlook the ethical, humanitarian and human rights concerns about autonomous weapons systems as we delve into some technical topics.
Mr. President, CCW protocols have a history of addressing the ethical and humanitarian concerns about weapons. Protocol IV on blinding laser weapons is particularly relevant to our discussions. As a pre-emptive prohibition on an emerging technology motivated by ethical concerns, Protocol IV has been very effective in preventing the use of an abhorrent weapon without limiting the development of laser technology for other purposes including other military purposes. It is important to note that Protocol IV has some of the widest membership of all the protocols including all five permanent members of the United Nations Security Council, all the states that have chaired the autonomous weapons talks here at the CCW and most of the states who have expressed views about autonomous weapons. All those states are party to a Protocol that banned for ethical reasons a weapon before it was ever deployed in conflict.
Above all, we hope that the states present this week will reflect on the concept of responsibility. The Government of Poland’s working paper which discusses this topic is a useful starting point. We see responsibility as a theme that runs throughout these discussions.
A Canadian godfather of Artificial Intelligence has often spoken of the need to pursue responsible AI. Responsible AI makes life better for society and helps “prevent the misuse of AI applications that could cause harm” as noted in the G7 Annex.
We have been entrusted with a great responsibility here in this room. We have the responsibility to set boundaries and prevent future catastrophes. We must be bold in our actions or we could face a situation where computer programmers become de facto policy makers.
Above all, as part of our collective humanity, we must remain responsible for our actions – we cannot divest control to one of our creations whether it is in our daily actions, or more crucially for this week’s discussion, in our decisions to use weapons.
In the past, those sitting in these seats have met their responsibility to “continue the codification and progressive development of the rules of international law applicable in armed conflict” by negotiating new protocols and in the case of blinding laser weapons a pre-emptive protocol. Now it is our turn and this is our issue to address.
Mines Action Canada and the Campaign to Stop Killer Robots have been busy talking about autonomous weapons this winter.
MAC Executive Director Paul Hannon traveled to Halifax to speak at the Canadian International Council’s (CIC) local AGM. In his talk, he shared the game plan to stop killer robots, drawing on lessons from the Ottawa Treaty banning landmines. The CIC posted Paul’s blog post accompanying this lecture, which you can find online. The blog post states quite clearly that it’s decision time for Canada on autonomous weapons.
“The third revolution in warfare is coming fast. Unlike most revolutions we know this one is coming. What is even more unusual is that we can stop this revolution before it starts. Before anyone is injured or killed. It will take a lot of political will by many countries including Canada. Do we have the will and more importantly the courage to use it?”
Mary Wareham, the Campaign to Stop Killer Robots’ coordinator, spoke to the prestigious Munich Security Conference in February. A public event on artificial intelligence and modern conflict organized by the conference saw common views emerge from different perspectives against weapons that, once activated, could identify, select and attack targets without further human intervention. The event opened with remarks by a “robot” and featured a panel where Mary spoke alongside the president of Estonia, a general from Germany, and a former head of NATO. The recap of that event is available on the global campaign’s website.
One of the Campaign to Stop Killer Robots’ co-founders, Noel Sharkey of the International Committee for Robot Arms Control, will be speaking in Halifax on March 21. Noel will debate Duncan MacIntosh, Professor of Philosophy at Dalhousie University, on the role of autonomous weapons and the question of how concerned we should be. More details are available here.
On March 28, Erin Hunt, Program Coordinator, will join ThePANEL to discuss autonomous weapons and the campaign. The AI Arms Race: Should We Be Worried? brings together experts from Canada and the U.S. to debate the impact of AI on global politics and human rights. Tickets are available online.
Wherever we are talking to the public about autonomous weapons, one thing is clear: Canadians, like others around the world, are expecting their government to come up with a plan to prevent the development of autonomous weapons soon. In order to make that happen, MAC and the Campaign to Stop Killer Robots are working hard in preparation for the Group of Governmental Experts meeting in Geneva in April.
This summer, our Executive Director, Paul Hannon, spoke with Bloomberg TV about autonomous weapons systems. You can see the whole interview here.
Our Executive Director, Paul Hannon delivered an opening statement at the CCW meeting on autonomous weapons systems today.
Thank you, Chairperson.
I appreciate the opportunity to speak on behalf of Mines Action Canada. Mines Action Canada is a Canadian disarmament organization that has been working to reduce the humanitarian impact of indiscriminate weapons for over twenty years. During this time, we have worked with partners around the world including here at the CCW to respond to the global crisis caused by landmines, cluster munitions, and other indiscriminate weapons. What makes this issue different is we have an opportunity to act now before a weapon causes a humanitarian catastrophe.
As a co-founder of the Campaign to Stop Killer Robots, Mines Action Canada’s concern with the development of autonomous weapons systems runs across the board. We have numerous legal, moral/ethical, technical, operational, political, and humanitarian concerns about autonomous weapons systems. The question of the acceptability of delegating death is not an abstract thought experiment but the fundamental question, with policy, legal and technological implications for the real world. We must all keep this question at the fore whenever discussing autonomous weapons systems: do you want to live in a world where algorithms or machines can make the decision to take a life? War is a human activity, and removing the human component from war is dangerous for everybody. We strongly support the position of the Campaign to Stop Killer Robots that permitting machines to take a human life on the battlefield or in policing, border or crowd control, and other circumstances is unacceptable.
We have watched the development of discourse surrounding autonomous weapons systems since the beginning of the campaign. 2015 saw a dramatic expansion of the debate into different forums and segments of our global community and that expansion and the support it has generated have continued into 2016. Be it at artificial intelligence conferences, the World Economic Forum, the Halifax Security Forum or in the media, the call for a pre-emptive ban is reaching new audiences. The momentum towards a pre-emptive ban on autonomous weapons systems is clearly growing.
Mines Action Canada recognizes that there are considerable challenges facing the international community in navigating legal issues concerning an emerging technology. The desire to not hinder research and development into potentially beneficial technologies is understandable, but a pre-emptive ban on autonomous weapons systems will not limit beneficial research. As a senior executive from a robotics company told us at a workshop on autonomous weapons last week, there are no other applications for an autonomous system which can make a “kill or not kill” decision. The function providing an autonomous weapon the ability to make the “kill decision” and implement it does not have an equivalent civilian use. A pre-emptive ban would have no impact on the funding of research and development for artificial intelligence or robotics.
On the other hand, there are numerous other applications that would benefit society by improving other aspects of robot weapons while maintaining meaningful human control over the decision to cause harm. Communications technology, encryption, virtual reality, sensor technology – all have much broader and beneficial applications, from search and rescue by first responders to watching a school play when you can’t be there in person. None of that research and development would be hindered by a pre-emptive ban on autonomous weapons systems. A pre-emptive ban would, though, allow governments, the private sector and academics to direct investments towards technologies with as much future benefit to non-military uses as possible.
While the “kill decision” function is only necessary for one application of robotic technology, predictability is an important requirement for all robots regardless of the context in which they are used. Manufacturing robots work well because they operate in a predictable space. Driverless cars will also work in a predictable space, though one much less predictable than a factory, which is one of the reasons they require so much more testing and time to develop. Robotic weapons will be required to work in the least predictable of spaces, that is, in combat, and are therefore much more prone to failure. Commanders, on the other hand, need weapons they can rely on. Civilians need, and have a right to expect, that every effort is taken to protect them from the harmful effects of conflict.
Mines Action Canada appreciates the significant number of expert presentations scheduled for this week, but we hope that states will take time to share their views throughout the week. It is time for states to begin to talk about their concerns, their positions and their policies. For this reason, we are calling on the High Contracting Parties to take the next step at the Review Conference later this year and give a GGE a mandate to negotiate a new protocol on autonomous weapons.
We note that in the last 20 years three new legal instruments have entered into force. Each bans a weapon system, and each was covered by the general rules of International Humanitarian Law at the time, but the international community felt that new specific laws banning these weapons were warranted. This not only strengthened the protection of civilians, but also made IHL more robust.
Autonomous weapons systems are not your average new weapon; they have the potential to fundamentally alter the nature of conflict. As a “game-changer” autonomous weapons systems deserve a serious and in-depth discussion. That discussion should also happen at the national level. Mines Action Canada hopes that our country will begin that effort this spring through the recently announced defence review and that other states will follow suit with their own national discussions.
At the core of this work is a desire to protect civilians and limit the humanitarian harm caused by armed conflict. We urge states not to lose sight of the end goal and their motivations as they complete the difficult work necessary for a robust and effective pre-emptive ban.
A key lesson learned from the Canadian-led initiative to ban landmines is to not wait until there is a global crisis before taking action. Fifteen years after the Ottawa Treaty banning landmines was opened for signature, there has been remarkable success. However, due to the widespread use of the weapon before the ban treaty became international law, it has taken considerable effort and resources to reduce that international crisis to a national-level problem. Much work remains, but all the trend lines are positive. With continued political will combined with sustained funding, this is a crisis that is solvable.
That lesson of taking action before a global crisis exists was an important factor in the Norwegian-led initiative to ban cluster munitions. Although a much more high-tech weapon than landmines, cluster munitions have caused unacceptable humanitarian harm when they have been used. The indiscriminate effects and the impact they have on innocent civilians resulted in cluster munitions being banned. Fortunately, cluster bombs have not been as widely used as landmines, so the 2008 Convention on Cluster Munitions (CCM) is very much a preventive treaty. With tens of millions of cluster submunitions, also known as bomblets, having been destroyed from the stockpiles of states parties to the treaty, the preventive nature of the CCM is already saving countless lives, limbs and livelihoods. However, as with landmines, the use of cluster munitions before the treaty came into force means there is much work remaining to clear the existing contamination and help victims rebuild their shattered lives.
Both landmines and cluster munitions were considered advanced weapons in their day. Landmines were sometimes referred to as the ‘perfect soldier’, but once planted they could not tell the difference between a child and a combatant. Cluster munitions were a much more expensive and sophisticated weapon than landmines, yet once dropped or launched the submunitions dispersed from the carrier munition could not distinguish between a soldier and a civilian. Cluster submunitions also had high failure rates and often did not explode upon impact as designed, leaving behind de facto minefields.
Both landmines and cluster munitions shared the characteristic of not knowing when the conflict had ended, so they continued to kill and injure long after peace had arrived. In many cases they continued their destructive tasks decades after hostilities had ceased.
Another characteristic they shared is that once humans were no longer involved, i.e. after the mines were planted or the munitions fired, the impact of the weapons became immediately problematic. The lack of human control over who the target was or when an explosion would occur resulted in weapons that were indiscriminate by nature, which was a key factor in the movements to ban them.
Today in London, England a new campaign will be launched taking the concept of prevention to its full extent by banning a weapon that is not yet in use. Fully autonomous weapons are very much on the drawing boards and in the plans of technologically advanced militaries such as China, Russia, the UK and the US. These weapons pose a wide range of ethical, moral, and legal issues. The Campaign to Stop Killer Robots seeks to raise awareness of those issues and to encourage a pre-emptive ban on the weapons.
Over the past decade, the expanded use of unmanned armed vehicles or drones has dramatically changed warfare, bringing new humanitarian and legal challenges. Now rapid advances in technology are permitting the United States and other nations with high-tech militaries, including China, Israel, Russia, and the United Kingdom, to move toward systems that would give full combat autonomy to machines.
Lethal robot weapons which would be able to select and attack targets without any human intervention take warfare to dangerous and unacceptable levels. The new campaign launched today is a coordinated international coalition of non-governmental organizations concerned with the implications of fully autonomous weapons, also called “killer robots.”
The Campaign to Stop Killer Robots calls for a pre-emptive and comprehensive ban on the development, production, and use of fully autonomous weapons. The prohibition should be achieved through an international treaty, as well as through national laws and other measures.
The term fully autonomous weapons may sound like something from a video game, but they are not. They are lethal weapons and once programmed will not be controlled by anyone. While some may find the idea of machines fighting machines with humans spared the death and destruction of combat appealing, the fact is that will not be the case. We are not talking here about futuristic cyborgs battling each other to death, but about robots designed to kill humans. Thus the name killer robots is simultaneously deadly accurate and highly disturbing.
We live in a world where technology is omnipresent, but we are also well aware of its limitations. While we enjoy the benefits of technology and appreciate those who create and operate it, we are also well aware that airplanes sometimes crash, trains derail, ships run aground, cars get recalled, the internet occasionally blacks out (as do power grids), computers freeze, viruses spread via email messages or websites, and people occasionally end up in the wrong place because of a malfunctioning or poorly programmed GPS device. To use the vernacular, “shit happens” – or in this case, hi-tech shit happens. What could possibly go wrong with arming robots without any meaningful human control?
It would also be comforting to think that since these are very advanced weapons only the “good guys” would have them. However, events in the last two years in Libya, North Korea and Syria, to name a few, would indicate that desperate dictators and rogue states have no problems acquiring the most sophisticated and hi-tech weaponry. If they can get them so can terrorists and criminals.
Scientists and engineers have created some amazing robots which have the potential to greatly improve our lives, but no scientist or engineer should be involved in creating an armed robot that can operate without human control. Computer scientists and engineers have created fabulous devices which have increased our productivity and made life much more enjoyable for millions of people. Those computer experts should never create programs that would allow an armed machine to operate without any human in control.
The hundreds of thousands of landmine and cluster munition victims around the world are testament to the fact that what looks good on the drawing board or in the lab can have deadly consequences for innocent civilians, despite the best intentions or even the best technology that money can buy. We need to learn the key lesson of these two weapons: tragedies can and should be prevented. The time to stop fully autonomous weapons does not begin next week, or next month, or during testing, or after their first use. The time to stop killer robots begins today, April 23, 2013, in London, England and wherever you are reading this.
– Paul Hannon