The Campaign to Stop Killer Robots is deeply disappointed that the Convention on Conventional Weapons (CCW) has cancelled a crucial week of formal discussions on fully autonomous weapons in August. This step was taken because of the failure of several states, most notably Brazil, to pay their assessed dues for the convention’s meetings.
“The collective failure of countries to find a solution to their financial woes doesn’t mean they can stop addressing concerns over weapons that would select and attack targets without further human intervention,” said Mary Wareham of Human Rights Watch, coordinator of the Campaign to Stop Killer Robots. “If the CCW is unable to act, nations must find other ways to maintain the momentum toward a ban,” she said. “Countries that agree with the need to retain human control of weapons systems should move swiftly to adopt national policies and laws and to negotiate a new international treaty prohibiting fully autonomous weapons.”
The call for a preemptive ban on fully autonomous weapons has been endorsed by 19 countries, and dozens more states have affirmed the need to retain human control over the selection of targets and use of force. This clearly indicates that they see a need to prevent the development of fully autonomous weapons. Last December, China became the first permanent member of the UN Security Council to find that new international law is required to regulate fully autonomous weapons.
The Campaign calls on Canada and all countries to urgently address the enormous humanitarian challenges posed by these weapons by endorsing the call for a ban. It is vital and urgent that all stakeholders work together to secure a new international treaty before these weapons are unleashed.
“Canada has a long history of taking action when the CCW is unable to move forward,” said Paul Hannon, Executive Director of Mines Action Canada, a co-founder of the Campaign to Stop Killer Robots. “We are calling on Canada to act now to ensure that there is always meaningful human control over weapons. The international community cannot let the work done thus far go to waste.”
The Campaign to Stop Killer Robots fundamentally objects to permitting machines to take a human life on the battlefield or in policing, border control, and other circumstances. It calls for a preemptive ban on fully autonomous weapons through new international law as well as through domestic legislation.
Following the launch of the Campaign to Stop Killer Robots and a debate in the Human Rights Council, countries agreed in November 2013 to begin discussing what they called lethal autonomous weapons systems at the Convention on Conventional Weapons at the United Nations in Geneva. The CCW is a framework treaty that prohibits or restricts certain weapons and its 1995 protocol on blinding lasers is an example of a weapon being preemptively banned before it was acquired or used.
Most of the CCW’s 124 high contracting parties participated in three meetings on lethal autonomous weapons systems in 2014–2016, in addition to UN agencies, the International Committee of the Red Cross, and the Campaign to Stop Killer Robots. Last December, at their Fifth Review Conference, CCW states decided to formalize and expand those deliberations by establishing a Group of Governmental Experts on lethal autonomous weapons systems to meet in August and November 2017, chaired by Ambassador Amandeep Singh Gill of India.
However, on 30 May, the CCW’s president-designate, Ambassador Matthew Rowland of the UK, announced that the Group of Governmental Experts meeting scheduled for 21–25 August has been cancelled due to a lack of funds. Rowland had previously issued several warnings that the non-payment of assessed financial contributions would likely mean the cancellation of CCW meetings planned for 2017.
Several countries have financial arrears from previous years, but according to the UN’s official summary, Brazil accounts for 86 percent of the outstanding contributions due to four core humanitarian disarmament treaties, including the CCW. Brazil last paid its assessed CCW contributions in 2010. The Campaign to Stop Killer Robots has appealed to Brazil to pay its outstanding contributions without delay, and it challenges CCW states to find cost-saving measures that do not require the cancellation of key meetings.
Several autonomous weapons systems with various degrees of human control are currently in use by high-tech militaries including CCW states the US, China, Israel, South Korea, Russia, and the UK. The concern is that low-cost sensors and advances in artificial intelligence are making it increasingly practical to design weapons systems that would target and attack without any meaningful human control. If the trend towards autonomy continues, humans will start to fade out of the decision-making loop, first retaining only a limited oversight role, and then no role at all.
Canada, France, the UK, and the US supported establishing the CCW Group of Governmental Experts last December, but remain unambitious in their overall goals for the process, proposing a focus on sharing best practices and achieving greater transparency in the conduct of legal reviews of new weapons systems. Russia openly opposed the creation of a Group of Governmental Experts, but did not block multilateral consensus for establishing one.
This summer, our Executive Director, Paul Hannon, spoke with Bloomberg TV about autonomous weapons systems. You can see the whole interview here.
**CONTEST IS NOW INTERNATIONAL – STUDENTS FROM AROUND THE WORLD WELCOME TO ENTER**
Mines Action Canada is launching a Keep Killer Robots Fiction video contest for students. We are inviting students from across Canada and around the world to make and submit a two-minute video on the theme of “Keep Killer Robots Fiction”.
What is the purpose? The purpose of this competition is to find new, compelling and provocative ways to start a public conversation about autonomous weapons systems. Autonomous weapons systems, or killer robots, are future weapons that could select and fire upon targets without human control.
Killer robots have been a staple trope in fiction and entertainment for years. Over the past decade, however, fully autonomous weapons have come closer to reality. Recently we have seen a dramatic rise in unmanned weapons that has changed the face of warfare. New technology is permitting serious efforts to develop fully autonomous weapons. These robotic weapons would be able to choose and fire on targets on their own, without any human intervention. This capability would pose a fundamental challenge to the protection of civilians and to compliance with international human rights and humanitarian law. To be clear, fully autonomous weapons are not drones; drones have a human pilot in a remote location. Fully autonomous weapons are a large step beyond armed drones. You can learn more about autonomous weapons systems online at: www.stopkillerrobots.ca.
Your submission should illustrate one of the major problems with autonomous weapons systems or ask a question about handing over life and death decisions to a machine:
- A lack of accountability – who is responsible if an autonomous weapon kills the wrong person or malfunctions?
- Inability to distinguish between legitimate, legal targets and others – human soldiers must be able to tell the difference between soldiers and civilians; could a robot ever make that distinction?
- The moral issues surrounding outsourcing life and death decisions to machines – is it right to allow machines to choose to end a human life?
Please don’t limit yourself to these example questions about autonomous weapons; they are intended to inspire you to create some questions of your own to guide your project.
Who can participate? Submissions will be accepted from any contestant between the ages of 18 and 30 who is currently enrolled in post-secondary education.
How do I enter the competition? Submitting your entry to the video contest is easy! Simply complete these three steps by March 15, 2015:
- Visit the contest entry form on our website, and fill in all of the required information.
- Upload your video to Vimeo and specify the location (URL) on the entry form. Memberships to Vimeo are free.
- Submit your online entry form to the Mines Action Canada team.
The Contest Rules and other information can be found in the Video Contest Announcement. Please read the announcement carefully to ensure that your project is eligible for consideration by our panel of expert judges. The contest entry form is available online at: http://goo.gl/forms/0VOGD6mgTp.
Great news! Today, Clearpath Robotics, a robotics firm based in Kitchener, Ontario, announced a world leading policy to “not manufacture weaponized robots that remove humans from the loop” and pledged their support for the Campaign to Stop Killer Robots. In an open letter, Ryan Gariepy, Co-Founder and CTO, writes that “[d]espite our continued involvement with Canadian and international military research and development, Clearpath Robotics believes that the development of killer robots is unwise, unethical, and should be banned on an international scale.”
As a co-founder and the Canadian representative of the Campaign to Stop Killer Robots, Mines Action Canada welcomes Clearpath Robotics’ decision and applauds their staff for their thoughtful and courageous stance on this issue. “Clearpath Robotics has set the ethical standard for robotics companies around the world. Their pledge to not manufacture autonomous weapons systems demonstrates clearly that research and development into autonomous robots and military robots does not require the creation of ‘killer robots’ and that there are many applications of autonomous robotics that can benefit humanity,” said Paul Hannon, Executive Director, Mines Action Canada. “As a Canadian, I am proud that a Canadian company was the first in the world to pledge to not manufacture killer robots.”
As the international community is scheduled to discuss autonomous weapons systems at the United Nations again this fall, Mines Action Canada strongly supports Clearpath Robotics’ pledge and we join them in encouraging “those who might see business opportunities in this technology to seek other ways to apply their skills and resources for the betterment of humankind.” We look forward to similar statements from other robotics companies in Canada and around the world. Members of the public who share Clearpath Robotics’ views can sign the Keep Killer Robots Fiction petition at http://bit.ly/KRpetition while individual roboticists and scientists can join the International Committee for Robot Arms Control’s Scientists’ Call online at: http://icrac.net/call/.
Canadians are among the 270 engineers, computing and artificial intelligence experts, roboticists, and professionals from related disciplines who have signed an experts’ call to ban killer robots. The experts say “given the limitations and unknown future risks of autonomous robot weapons technology, we call for a prohibition on their development and deployment. Decisions about the application of violent force must not be delegated to machines.”
The International Committee for Robot Arms Control (ICRAC) has thus far received 272 signatures from 37 countries on the statement, which continues to collect signatures. In an announcement released today, Professor Noel Sharkey, Chair of ICRAC, said “Governments need to listen to the experts’ warnings and work with us to tackle this challenge together before it is too late. It is urgent that international talks get started now to prevent the further development of autonomous robot weapons before it is too late.”
Canada does not currently have a policy on fully autonomous weapons, and we hope that the government will engage these experts and others as it creates such a policy. We expect to see additional signatures from Canadian experts as this issue gains momentum. At present, the University of Toronto has the largest number of signatories, but experts from other organizations and institutions still have time to sign the call. As the quote below from Geoffrey Hinton indicates, now is the time to ensure that artificial intelligence and robotic technologies are used for the betterment of humanity.
“Artificial Intelligence can improve people’s lives in so many ways, but researchers need to push for positive applications of technology by supporting a ban on autonomous weapons systems.”
Geoffrey Hinton FRS, [founding father of modern machine learning] Raymond Reiter Distinguished Professor of Artificial Intelligence at the University of Toronto
Prof Noel Sharkey gave a talk at TEDx Sheffield about fully autonomous weapons and the Campaign to Stop Killer Robots. Prof. Sharkey is one of the founders of the International Committee for Robot Arms Control. He delivers a passionate call to action to stop killer robots. Take a few minutes out of your day to see him talk about his journey from a boy who loved toy soldiers growing up in the shadow of World War II to a leading campaigner in the effort to stop killer robots and protect civilians. Plus he even shares a little song about the CIA!
This week, the United Nations Human Rights Council became the first UN body to discuss the issue of killer robots. To mark the occasion, the Campaign to Stop Killer Robots headed to Geneva to introduce our campaign to diplomats, UN agencies and civil society. Check out the full report from the international campaign.
A key lesson learned from the Canadian-led initiative to ban landmines is not to wait until there is a global crisis before taking action. Fifteen years after the Ottawa Treaty banning landmines was opened for signature, there has been remarkable success. However, due to the widespread use of the weapon before the ban treaty became international law, it has taken considerable effort and resources to reduce that international crisis to a national-scale problem. Much work remains, but all the trend lines are positive. With continued political will combined with sustained funding, this is a crisis that is solvable.
That lesson of taking action before a global crisis exists was an important factor in the Norwegian-led initiative to ban cluster munitions. Although a much more high-tech weapon than landmines, cluster munitions have caused unacceptable humanitarian harm when they have been used. Their indiscriminate effects and their impact on innocent civilians resulted in cluster munitions being banned. Fortunately, cluster bombs have not been as widely used as landmines, so the 2008 Convention on Cluster Munitions (CCM) is very much a preventive treaty. With tens of millions of cluster submunitions, also known as bomblets, having been destroyed from the stockpiles of states parties to the treaty, the preventive nature of the CCM is already saving countless lives, limbs and livelihoods. However, as with landmines, the use of cluster munitions before the treaty came into force means much work remains to clear existing contamination and to help victims rebuild their shattered lives.
Both landmines and cluster munitions were considered advanced weapons in their day. Landmines were sometimes referred to as the ‘perfect soldier’, but once planted they could not tell the difference between a child and a combatant. Cluster munitions were a much more expensive and sophisticated weapon than landmines, yet once dropped or launched, the submunitions dispersed from the carrier munition could not distinguish between a soldier and a civilian. Cluster submunitions also had high failure rates; many did not explode upon impact as designed, leaving behind de facto minefields.
Both landmines and cluster munitions shared the characteristic of not knowing when a conflict had ended, so they continued to kill and injure long after peace had been achieved. In many cases they continued their destructive tasks decades after hostilities had ceased.
Another characteristic they shared is that once humans were no longer involved, i.e. after the weapons had been planted or fired, their impact became problematic. With no human control over who the target was or when an explosion would occur, the weapons were indiscriminate by nature, which was a key factor in the movements to ban them.
Today in London, England a new campaign will be launched taking the concept of prevention to its full extent by banning a weapon that is not yet in use. Fully autonomous weapons are very much on the drawing boards and in the plans of technologically advanced militaries such as China, Russia, the UK and the US. These weapons pose a wide range of ethical, moral, and legal issues. The Campaign to Stop Killer Robots seeks to raise awareness of those issues and to encourage a pre-emptive ban on the weapons.
Over the past decade, the expanded use of unmanned armed vehicles or drones has dramatically changed warfare, bringing new humanitarian and legal challenges. Now rapid advances in technology are permitting the United States and other nations with high-tech militaries, including China, Israel, Russia, and the United Kingdom, to move toward systems that would give full combat autonomy to machines.
Lethal robot weapons, which would be able to select and attack targets without any human intervention, would take warfare to dangerous and unacceptable levels. The new campaign launched today is a coordinated international coalition of non-governmental organizations concerned with the implications of fully autonomous weapons, also called “killer robots.”
The Campaign to Stop Killer Robots calls for a pre-emptive and comprehensive ban on the development, production, and use of fully autonomous weapons. The prohibition should be achieved through an international treaty, as well as through national laws and other measures.
Fully autonomous weapons may sound like something from a video game, but they are not. They are lethal weapons that, once programmed, will not be controlled by anyone. While some may find appealing the idea of machines fighting machines, with humans spared the death and destruction of combat, the fact is that will not be the case. We are not talking here about futuristic cyborgs battling each other to the death, but about robots designed to kill humans. Thus the name killer robots is simultaneously deadly accurate and highly disturbing.
We live in a world where technology is omnipresent, but we are also well aware of its limitations. While we enjoy the benefits of technology and appreciate those who create and operate it, we are also well aware that airplanes sometimes crash, trains derail, ships run aground, cars get recalled, the internet occasionally blacks out (as do power grids), computers freeze, viruses spread via email messages or websites, and people occasionally end up in the wrong place because of a malfunctioning or poorly programmed GPS device. To use the vernacular, “shit happens,” or in this case, hi-tech shit happens. What could possibly go wrong with arming robots without any meaningful human control?
It would also be comforting to think that since these are very advanced weapons only the “good guys” would have them. However, events in the last two years in Libya, North Korea and Syria, to name a few, would indicate that desperate dictators and rogue states have no problems acquiring the most sophisticated and hi-tech weaponry. If they can get them so can terrorists and criminals.
Scientists and engineers have created some amazing robots which have the potential to greatly improve our lives, but no scientist or engineer should be involved in creating an armed robot that can operate without human control. Computer scientists and engineers have created fabulous devices which have increased our productivity and made life much more enjoyable for millions of people. Those computer experts should never create programs that would allow an armed machine to operate without any human in control.
The hundreds of thousands of landmine and cluster munition victims around the world are testament to the fact that what looks good on the drawing board or in the lab can have deadly consequences for innocent civilians, despite the best intentions or even the best technology that money can buy. We need to learn the key lesson of these two weapons: tragedies can and should be prevented. The time to stop fully autonomous weapons does not begin next week, or next month, or during testing, or after their first use. The time to stop killer robots begins today, April 23, 2013, in London, England, and wherever you are reading this.
– Paul Hannon