Category Archives: Executive Director
Mines Action Canada and the Campaign to Stop Killer Robots have been busy talking about autonomous weapons this winter.
MAC Executive Director Paul Hannon traveled to Halifax to speak at the Canadian International Council’s (CIC) local AGM. In his talk, he shared the game plan to stop killer robots, drawing on lessons from the Ottawa Treaty banning landmines. The CIC posted Paul’s blog post accompanying the lecture, which you can find online. The blog post states quite clearly that it is decision time for Canada on autonomous weapons.
“The third revolution in warfare is coming fast. Unlike most revolutions we know this one is coming. What is even more unusual is that we can stop this revolution before it starts. Before anyone is injured or killed. It will take a lot of political will by many countries including Canada. Do we have the will and more importantly the courage to use it?”
Mary Wareham, the Campaign to Stop Killer Robots’ coordinator, spoke at the prestigious Munich Security Conference in February. A public event on artificial intelligence and modern conflict organized by the conference saw common views emerge from different perspectives against weapons that, once activated, could identify, select and attack targets without further human intervention. The event opened with remarks by a “robot” and featured a panel where Mary spoke alongside the president of Estonia, a general from Germany, and a former head of NATO. A recap of the event is available on the global campaign’s website.
One of the Campaign to Stop Killer Robots’ co-founders, Noel Sharkey of the International Committee for Robot Arms Control, will be speaking in Halifax on March 21. Noel will debate Duncan MacIntosh, Professor of Philosophy at Dalhousie University, on the role of autonomous weapons and the question of how concerned we should be. More details are available here.
On March 28, Erin Hunt, Program Coordinator, will join ThePANEL to discuss autonomous weapons and the campaign. The AI Arms Race: Should We Be Worried? brings together experts from Canada and the U.S. to debate the impact of AI on global politics and human rights. Tickets are available online.
Wherever we talk to the public about autonomous weapons, one thing is clear: Canadians, like others around the world, expect their government to come up with a plan soon to prevent the development of autonomous weapons. To make that happen, MAC and the Campaign to Stop Killer Robots are working hard in preparation for the Group of Governmental Experts meeting in Geneva in April.
This summer, our Executive Director, Paul Hannon, spoke with Bloomberg TV about autonomous weapons systems. You can see the whole interview here.
Our Executive Director, Paul Hannon delivered an opening statement at the CCW meeting on autonomous weapons systems today.
Thank you, Chairperson.
I appreciate the opportunity to speak on behalf of Mines Action Canada. Mines Action Canada is a Canadian disarmament organization that has been working to reduce the humanitarian impact of indiscriminate weapons for over twenty years. During this time, we have worked with partners around the world including here at the CCW to respond to the global crisis caused by landmines, cluster munitions, and other indiscriminate weapons. What makes this issue different is we have an opportunity to act now before a weapon causes a humanitarian catastrophe.
As a co-founder of the Campaign to Stop Killer Robots, Mines Action Canada’s concern with the development of autonomous weapons systems runs across the board. We have numerous legal, moral/ethical, technical, operational, political, and humanitarian concerns about autonomous weapons systems. The question of the acceptability of delegating death is not an abstract thought experiment, but the fundamental question, with policy, legal and technological implications for the real world. We must all keep this question at the fore whenever discussing autonomous weapons systems: do you want to live in a world where algorithms or machines can make the decision to take a life? War is a human activity, and removing the human component in war is dangerous for everybody. We strongly support the position of the Campaign to Stop Killer Robots that permitting machines to take a human life on the battlefield or in policing, border or crowd control, and other circumstances is unacceptable.
We have watched the development of discourse surrounding autonomous weapons systems since the beginning of the campaign. 2015 saw a dramatic expansion of the debate into different forums and segments of our global community and that expansion and the support it has generated have continued into 2016. Be it at artificial intelligence conferences, the World Economic Forum, the Halifax Security Forum or in the media, the call for a pre-emptive ban is reaching new audiences. The momentum towards a pre-emptive ban on autonomous weapons systems is clearly growing.
Mines Action Canada recognizes that there are considerable challenges facing the international community in navigating legal issues concerning an emerging technology. The desire to not hinder research and development into potentially beneficial technologies is understandable, but a pre-emptive ban on autonomous weapons systems will not limit beneficial research. As a senior executive from a robotics company told us at a workshop on autonomous weapons last week, there are no other applications for an autonomous system which can make a “kill or not kill” decision. The function providing an autonomous weapon the ability to make the “kill decision” and implement it does not have an equivalent civilian use. A pre-emptive ban would have no impact on the funding of research and development for artificial intelligence or robotics.
On the other hand, there are numerous applications that would benefit society by improving other aspects of robot weapons while maintaining meaningful human control over the decision to cause harm. Communications technology, encryption, virtual reality, sensor technology – all have much broader and beneficial applications, from search and rescue by first responders to watching a school play when you can’t be there in person. None of that research and development would be hindered by a pre-emptive ban on autonomous weapons systems. A pre-emptive ban would, however, allow governments, the private sector and academics to direct investments towards technologies with as much future benefit to non-military uses as possible.
While the “kill decision” function is only necessary for one application of robotic technology, predictability is an important requirement for all robots regardless of the context in which they are used. Manufacturing robots work well because they work in a predictable space. Driverless cars will also work in a predictable space, though one much less predictable than a factory, which is one of the reasons they require so much more testing and time to develop. Robotic weapons will be required to work in the least predictable of spaces, that is, in combat, and are therefore much more prone to failure. Commanders, on the other hand, need weapons they can rely on. Civilians need, and have a right to expect, that every effort is taken to protect them from the harmful effects of conflict.
Mines Action Canada appreciates the significant number of expert presentations scheduled for this week, but we hope that states will take time to share their views throughout the week. It is time for states to begin to talk about their concerns, their positions and their policies. For this reason, we are calling on the High Contracting Parties to take the next step later this year at the Review Conference and establish a Group of Governmental Experts (GGE) with a mandate to negotiate a new protocol on autonomous weapons.
We note that in the last 20 years three new legal instruments have entered into force. Each bans a weapon system, and each weapon was covered by the general rules of International Humanitarian Law at the time, but the international community felt that new, specific laws banning these weapons were warranted. This not only strengthened the protection of civilians, but also made IHL more robust.
Autonomous weapons systems are not your average new weapon; they have the potential to fundamentally alter the nature of conflict. As a “game-changer” autonomous weapons systems deserve a serious and in-depth discussion. That discussion should also happen at the national level. Mines Action Canada hopes that our country will begin that effort this spring through the recently announced defence review and that other states will follow suit with their own national discussions.
At the core of this work is a desire to protect civilians and limit the humanitarian harm caused by armed conflict. We urge states not to lose sight of the end goal and their motivations as they complete the difficult work necessary for a robust and effective pre-emptive ban.
A key lesson learned from the Canadian-led initiative to ban landmines is to not wait until there is a global crisis before taking action. Fifteen years after the Ottawa Treaty banning landmines was opened for signatures, there has been remarkable success. However, due to the widespread use of the weapon before the ban treaty became international law, it has taken considerable effort and resources to reduce that international crisis to a national-level problem. Much work remains, but all the trend lines are positive. With continued political will and sustained funding, this is a crisis that is solvable.
That lesson of taking action before a global crisis exists was an important factor in the Norwegian-led initiative to ban cluster munitions. Although a much more high-tech weapon than landmines, cluster munitions have caused unacceptable humanitarian harm whenever they have been used. The indiscriminate effects and the impact they have on innocent civilians resulted in cluster munitions being banned. Fortunately, cluster bombs have not been as widely used as landmines, so the 2008 Convention on Cluster Munitions (CCM) is very much a preventive treaty. With tens of millions of cluster submunitions, also known as bomblets, having been destroyed from the stockpiles of states parties to the treaty, the preventive nature of the CCM is already saving countless lives, limbs and livelihoods. However, as with landmines, the use of cluster munitions that took place before the treaty came into force means there is much work remaining to clear the existing contamination and to help victims rebuild their shattered lives.
Both landmines and cluster munitions were considered advanced weapons in their day. Landmines were sometimes referred to as the ‘perfect soldier’, but once planted they could not tell the difference between a child and a combatant. Cluster munitions were a much more expensive and sophisticated weapon than landmines, yet once dropped or launched, the submunitions dispersed from the carrier munition could not distinguish between a soldier and a civilian. Cluster submunitions also had high failure rates; many did not explode upon impact as designed, leaving behind de facto minefields.
Both landmines and cluster munitions shared the characteristic of not knowing when the conflict had ended, so they continued to kill and injure long after peace had been established. In many cases they continued their destructive tasks decades after hostilities had ceased.
Another characteristic they shared is that once humans were no longer involved, i.e. after planting or firing, the impact of the weapons immediately became problematic. The absence of human control over who the target was or when an explosion would occur resulted in weapons that were indiscriminate by nature, which was a key factor in the movements to ban them.
Today in London, England a new campaign will be launched taking the concept of prevention to its full extent by banning a weapon that is not yet in use. Fully autonomous weapons are very much on the drawing boards and in the plans of technologically advanced militaries such as China, Russia, the UK and the US. These weapons pose a wide range of ethical, moral, and legal issues. The Campaign to Stop Killer Robots seeks to raise awareness of those issues and to encourage a pre-emptive ban on the weapons.
Over the past decade, the expanded use of unmanned armed vehicles or drones has dramatically changed warfare, bringing new humanitarian and legal challenges. Now rapid advances in technology are permitting the United States and other nations with high-tech militaries, including China, Israel, Russia, and the United Kingdom, to move toward systems that would give full combat autonomy to machines.
Lethal robot weapons which would be able to select and attack targets without any human intervention take warfare to dangerous and unacceptable levels. The new campaign launched today is a coordinated international coalition of non-governmental organizations concerned with the implications of fully autonomous weapons, also called “killer robots.”
The Campaign to Stop Killer Robots calls for a pre-emptive and comprehensive ban on the development, production, and use of fully autonomous weapons. The prohibition should be achieved through an international treaty, as well as through national laws and other measures.
Fully autonomous weapons may sound like something from a video game, but they are not. They are lethal weapons and, once programmed, will not be controlled by anyone. While some may find appealing the idea of machines fighting machines, with humans spared the death and destruction of combat, the fact is that will not be the case. We are not talking here about futuristic cyborgs battling each other to death, but about robots designed to kill humans. Thus the name killer robots is simultaneously deadly accurate and highly disturbing.
We live in a world where technology is omnipresent, but we are also well aware of its limitations. While we enjoy the benefits of technology and appreciate those who create and operate it, we are also well aware that airplanes sometimes crash, trains derail, ships run aground, cars get recalled, the internet occasionally blacks out (as do power grids), computers freeze, viruses spread via email messages or websites, and people occasionally end up in the wrong place because of a malfunctioning or poorly programmed GPS device. To use the vernacular, “shit happens” – or in this case, hi-tech shit happens. What could possibly go wrong with arming robots without any meaningful human control?
It would also be comforting to think that since these are very advanced weapons only the “good guys” would have them. However, events in the last two years in Libya, North Korea and Syria, to name a few, would indicate that desperate dictators and rogue states have no problems acquiring the most sophisticated and hi-tech weaponry. If they can get them so can terrorists and criminals.
Scientists and engineers have created some amazing robots with the potential to greatly improve our lives, but no scientist or engineer should be involved in creating an armed robot that can operate without human control. Likewise, computer scientists have created fabulous devices which have increased our productivity and made life much more enjoyable for millions of people, but those experts should never write programs that would allow an armed machine to operate without any human in control.
The hundreds of thousands of landmine and cluster munition victims around the world are testament to the fact that what looks good on the drawing board or in the lab can have deadly consequences for innocent civilians, despite the best intentions or even the best technology that money can buy. We need to learn the key lesson of these two weapons: tragedies can and should be prevented. The time to stop fully autonomous weapons does not begin next week, or next month, or during testing, or after their first use. The time to stop killer robots begins today, April 23, 2013, in London, England, and wherever you are reading this.
– Paul Hannon