Category Archives: Campaign
The 21st century is marked by technological advancements – like the internet – that have changed the way we live and made the world more connected. As a result of these advancements, particularly the internet, privacy laws have changed to adapt to new norms and to ensure that technology does not harm citizens. While previously we have had to create laws reactively to adapt to new technologies, in the case of autonomous weaponry it is important that we pre-emptively create laws to limit the negative impacts associated with these technologies. This requires lawmakers to have an in-depth understanding of the technologies and their implications; otherwise they will not be able to legislate effectively.
“How does [hateful information about me] show up on a 7-year-old’s iPhone?” Iowa Representative Steve King (R) asked Google CEO Sundar Pichai during Pichai’s congressional testimony.
The quote above demonstrates a blatant lack of understanding of technology by a U.S. lawmaker. Representative Steve King, 69, asking the CEO of Google about a problem with an iPhone – a product made by a different company – is the epitome of not understanding how technology works. This fundamental lack of understanding is troubling. How can we trust lawmakers to create effective laws to limit the negative impacts of more advanced technology such as artificial intelligence (AI) when they don’t even understand basic technology like their cellphones?
The fact is, we have a role to play too. Modern issues require modern solutions, and it’s up to us to hold lawmakers accountable and ensure that they understand the implications of new technologies.
One such technology currently under development is fully autonomous weapons systems. These systems could be used as weapons of war, or for surveillance, policing, and border control, giving governments more control over the lives of citizens. Fully autonomous weapons could include drones and other uncrewed vehicles programmed to target certain groups indiscriminately, without human intervention. While these systems are not yet in use, it is urgent that we create legislation to limit their negative impact. If we do not, the ramifications of this technology would be widespread. Employing autonomous weapons systems increases the likelihood that innocent civilians would be harmed during wartime, and increases governments’ capacity to surveil their populations. It is important that we maintain a human perspective in times of conflict to limit these ramifications.
The Campaign to Stop Killer Robots is an international coalition of more than 100 non-governmental organisations across 54 countries working to pre-emptively ban the use of fully autonomous weapons systems. As this technology is developed, international and national lawmakers must work to create a legal framework to ensure that its applications are limited.
This is where students come in. We have grown up in the age of technology and typically have a strong grasp of new technologies and their implications. We are the students of today, but we will also be the lawmakers of tomorrow. Because of this, it makes sense to involve us in decision-making at this early stage, to ensure that a comprehensive legal framework is implemented for this new technology.
You may be wondering, “How can I get involved?” It’s simple! There are lots of ways that students can effectively get involved in spreading the word about the Campaign to Stop Killer Robots, many of which are enabled by modern technology.
Reaching out to your Member of Parliament (MP)/Representative: Writing letters can be a great way to make your MP aware of an issue that you are passionate about. Students can outline their concerns about this new technology and share them with their MP. A well-written letter or email forces the MP to take a critical look at the issue, and if enough people reach out, then they will realize the importance of the issue and act.
Social Media: In the 21st century, social media has been very effective in spreading messages and driving change. Sites such as Twitter and Facebook are useful tools for spreading information about the Campaign at a national level, helping more students and citizens get involved. As a first step, you can follow the Campaign to Stop Killer Robots on Twitter at @BanKillerRobots, or on Facebook and Instagram at @StopKillerRobots. You can also use the hashtags #KillerRobots, #StudentsAgainstKillerRobots, and #AutonomousWeapons to join the conversation, and follow the Campaign on YouTube!
On-Campus Events: What better way to get students engaged than by offering them food? Students who wish to get involved can get together and host a bake sale or a pizza party to raise awareness of the Campaign on campus. Events like these could be coordinated by creating on-campus Campaign representatives across Canada. These reps would promote the Campaign on campus, whether through events, flyers, class engagements, or other activities. This would be an excellent way to ensure that hundreds of students have daily exposure to the Campaign.
The Campaign to Stop Killer Robots has been working hard to spread their message and to get support for a ban on fully autonomous weaponry. As students, we are the future of our world and are also well versed in technology, and as such, we should be involved in the decision making. We will make a difference in creating effective policy to ensure that this ever-evolving technology is properly regulated. If you would like more information please do not hesitate to reach out to the Campaign on social media!
Tyler Bloom interned at Mines Action Canada as the Campaign Assistant for the Campaign to Stop Killer Robots from January to April 2019. Tyler is an undergraduate student studying Political Science at the University of Ottawa.
Last week, Yoshua Bengio, Geoffrey Hinton, and Yann LeCun were named the winners of the 2018 Turing Award. The A.M. Turing Award, sometimes referred to as computing’s Nobel Prize, is given for major contributions of lasting importance to computing by the Association for Computing Machinery.
All three honourees are supporters of a ban on autonomous weapons, having signed public letters calling on states to take action. Yoshua Bengio and Geoffrey Hinton, who are both based in Canada, spoke with the Toronto Star regarding their concerns about autonomous weapons and the lack of progress on a ban.
While the award was being announced, states were meeting at the United Nations for the Group of Governmental Experts on lethal autonomous weapons systems. The meeting was marked by a minority of states delaying progress sought by the majority of states. States need to heed the concerns and warnings of experts like the 2018 Turing Award winners and the general public who do not want to see autonomous weapons developed.
- In the 26 countries surveyed in 2018, more than three in every five people (61%) responding to a new poll oppose the development of weapons systems that would select and attack targets without human intervention.
- Two-thirds (66%) of those opposed to lethal autonomous weapons systems were most concerned that they would “cross a moral line because machines should not be allowed to kill.”
- More than half (54%) of those opposed said that they were concerned that the weapons would be “unaccountable.”
- A near-identical survey of 23 countries by the same company in January 2017 found that 56% of respondents were opposed to lethal autonomous weapons systems; opposition had grown to 61% as of December 2018.
- A majority opposed killer robots in Canada (60%); China (60%); Russia (59%); the UK (54%); and the US (52%).
The results of this poll show that public sentiment is against the development of killer robots. A minority of states at the November 2018 meeting of the annual Convention on Conventional Weapons at the UN in Geneva used consensus rules to thwart meaningful diplomatic progress. Russia, Israel, South Korea, and the United States indicated at the meeting that they would not support negotiations for a new treaty. Currently, 28 states are seeking to ban fully autonomous weapons. Austria, Brazil, and Chile have formally proposed the urgent negotiation of “a legally-binding instrument to ensure meaningful human control over the critical functions” of weapons systems.
Mary Wareham of Human Rights Watch, Coordinator of the Campaign to Stop Killer Robots, said:
“The window to prevent the development of fully autonomous weapons is closing fast. This poll shows that public opposition is rising and with it the expectation that governments will act decisively and with urgency to deal with this concern.”
“The results of this poll show that the public in nations often identified as most in favour of killer robots, such as the US and Russia, opposes the development of these weapons.
“Efforts to address concerns over killer robots via the Convention on Conventional Weapons have failed to move to negotiate new law due to the objections of a handful of states.”
Six out of ten Canadian respondents to the poll opposed the development of weapons systems that would select and attack targets without human intervention, commonly known as autonomous weapons systems or killer robots. Canadian opposition to autonomous weapons was on par with the global results, but support for such weapons was below the global average: only 15 percent of Canadian respondents supported the use of autonomous weapons, while 25 percent were not sure.
“Canada has a history of leadership on peace and disarmament coupled with a strong AI sector that has been quite outspoken on this issue. A national ban on autonomous weapons systems and leadership internationally are in line with both a feminist foreign policy and the emphasis the government has put on AI as a future driver of the Canadian economy,” said Erin Hunt, Program Manager at Mines Action Canada, a co-founder of the Campaign to Stop Killer Robots.
“The Government of Canada should take note and listen to the voice of their people. Our shared security and humanity hinges on retaining meaningful human control over the use of force. Now is the time for political leadership to begin negotiations of a new treaty to prohibit fully autonomous weapons systems.”
The survey by the market research company Ipsos was commissioned by the Campaign to Stop Killer Robots and conducted in December 2018. The sample size was 500 to 1,000 people in each country.
Today, Mines Action Canada launched a new briefing paper on bias, intersectionality and autonomous weapons systems. Read the briefing note here.
The Campaign to Stop Killer Robots has hired their first full-time staff person. Isabelle Jones is the Campaign to Stop Killer Robots’ Project Officer, based in Ottawa with Mines Action Canada. As she gets settled into her new role, we sat down to chat.
MAC: You have an educational and work background in human rights and global development. When (and why) did you become interested in disarmament issues?
IJ: In my fourth year of undergraduate studies in Global Development, I moved towards focusing my studies on the intersection of development and conflict – how development happens or stalls in complex contexts, fragile regions, and in the post-conflict period. In one of my classes we watched a clip from a documentary on the impact that landmines have had – and continue to have – in Cambodia. I was already a little familiar with landmines and landmine action after participating in a Red Cross presentation on the topic, but watching that documentary it seemed to click that these weapons weren’t just devastating at the point of detonation, but could continue to impact the development of communities, regions, and even countries long after conflict ends.
After class, some quick Internet searching led me to the International Campaign to Ban Landmines (ICBL), and then the Mines Action Canada (MAC) website. Learning about MAC’s work and their youth internship program, I decided that the 8-month internship would be the perfect post-graduation work opportunity. I could take the year to learn more about humanitarian disarmament and the long process of recovery that follows conflict, and then apply to grad school. Unfortunately, timing was not on my side. The start date for the program shifted and I wouldn’t complete my program in time to be eligible, but that interest in disarmament work never went away.
MAC: And your interest in weapons technology?
IJ: I started thinking more and more about weapons technology. How has military technology, and the militarization of technology evolved since the laws of war were codified? How does this impact the lives and rights of civilians? And what does it say about how society views and values the human cost of war? I applied for my Master’s program with a proposal to research the use of drone technology in international and non-international armed conflicts, and the implications of this technology for international human rights and international humanitarian law. Over the course of my research my focus shifted slightly, and ultimately my dissertation argued that drone technology is deployed within a modern, bureaucratized system of labour – an institutional structure that can condition, shape and blind people to partake in morally, ethically and legally questionable acts of violence.
MAC: How did you learn about the Campaign to Stop Killer Robots?
IJ: Several members of the Campaign to Stop Killer Robots, like Article 36, HRW, ICRAC and PAX, have written and published research on armed drones, so I came across them in my dissertation research. This led me to learn about the work of the campaign, which I continued to follow throughout my studies and after their completion. I saw the proliferation of armed drones as a precursor to the lethal autonomous weapons systems that the campaign works to prohibit, and agreed with the campaign’s stance that it is essential to maintain human control over combat weapons. I have followed the work of the campaign closely and am honoured to be joining such a dedicated, passionate team of campaigners!
MAC: You will be working out of the Mines Action Canada office. What do you know about MAC’s work in humanitarian disarmament?
IJ: For decades MAC has been a leader in the global disarmament community, playing key roles in the International Campaign to Ban Landmines, Cluster Munition Coalition and (of course) the Campaign to Stop Killer Robots. Working nationally and internationally, MAC seeks to eliminate the consequences of indiscriminate weapons – weapons that cannot differentiate between civilians and combatants, and legitimate or illegitimate targets. This is the work that first sparked my interest and passion in humanitarian disarmament. After first hoping to become a MAC intern all those years ago, I am thrilled to now be working out of the Mines Action Canada office.
MAC: What are you most looking forward to in your new job?
IJ: I am most looking forward to the variety of the work. There is something very exciting about working in an environment where every day is a little different and there are always new challenges and opportunities to learn landing on your desk – which I think is part of the nature of working on a campaign that is small on staff, big on goals!
MAC: What do you like doing in your spare time?
IJ: In my spare time I love getting outdoors − camping, hiking, canoeing, scuba diving – and exploring new places through travel. Next on my travel bucket list is hiking in the Patagonia region of Chile. I am also an avid reader and you can often find me curled up on the couch with a new book, or re-reading one of my favourites for the umpteenth time.
Delivered by: Erin Hunt, Programme Coordinator
Thank you Mr. Chair. As a co-founder of the Campaign to Stop Killer Robots, Mines Action Canada is very conscious of public opinion and the public conversation concerning autonomous weapons systems. Recently, autonomous weapon systems have been in the news in Canada. Last week, over 200 Canadian Artificial Intelligence experts released an open letter to Prime Minister Justin Trudeau calling for Canadian leadership on the issue. The letter states [quote] “Lethal autonomous weapons systems that remove meaningful human control from determining the legitimacy of targets and deploying lethal force sit on the wrong side of a clear moral line.” [unquote]
Copies of this letter can be found at the back of the room. It is not only in Canada where the AI research community is speaking out – a similar letter was also released in Australia. As mentioned by my colleague, since the last time the CCW met a letter from over 100 founders of robotics and artificial intelligence companies calling for a preemptive ban on autonomous weapons was also released. Additional national letters are in the works.
These public letters show that concerns about possible negative impacts of a pre-emptive ban are misplaced, as ICRAC made clear moments ago, and that what the research community is calling for is bold and decisive action.
Mines Action Canada appreciates the significant number of expert presentations we have had this week, but we hope that states will take time to share their views substantively over the remaining days.
From states who say an Article 36 review may be sufficient to address our concerns about autonomous weapons systems, we hope to hear how such a review would be able to assess bias in the data used in machine learning, and how compliance with IHL would be ensured by systems that continue to learn after the review.
In light of persistent statements from some delegations that they are uncertain about what we are talking about here, we hope to hear states share their current understanding of autonomous weapons systems. Specific definitions are not needed at this stage but we believe there is more clarity and consensus on these questions than one may think.
We would like to hear more on next steps from states who are calling for a pre-emptive ban. Mines Action Canada would welcome concrete discussions on how to ensure that momentum is not lost on this issue. We lost a week of work in August but as I mentioned at the beginning of my statement, the public conversation about autonomous weapons continues to advance and the people at home expect us to make progress.
This week it is important to continue to build on the work done in the past and to ensure that further discussions take place in 2018. Administrative challenges do not lessen “the need to continue the codification and progressive development of the rules of international law applicable in armed conflict” that is reaffirmed in the Preamble of this Convention. The technology is rapidly advancing and so must our conversations here.
Mines Action Canada welcomes the letter calling for a ban on the weaponization of Artificial Intelligence (AI) from the Canadian AI research community, which was sent to Prime Minister Justin Trudeau. This letter follows a number of international letters in recent years (from faith leaders, scientists, Nobel laureates, company founders and others) addressed either to the UN or the global community in support of actions to prevent the development of autonomous weapons.
“This letter is evidence that the Canadian AI community wants to see leadership from Canada,” said Paul Hannon, Executive Director, Mines Action Canada. “Clearly Canada should become the 20th country to call for a pre-emptive ban on autonomous weapons and to lead a process to ensure that autonomous weapons systems never arrive on the battlefield.”
More than 200 AI researchers in Canada signed the open letter to the Prime Minister “calling on you and your government to make Canada the 20th country in the world to take a firm global stand against weaponizing AI. Lethal autonomous weapons systems that remove meaningful human control from determining the legitimacy of targets and deploying lethal force sit on the wrong side of a clear moral line.”
The letter goes further asking “Canada to announce its support for the call to ban lethal autonomous weapons systems at the upcoming United Nations Conference on the Convention on Certain Conventional Weapons (CCW). Canada should also commit to working with other states to conclude a new international agreement that achieves this objective.” One of the letter’s authors, Dr. Ian Kerr of the University of Ottawa wrote an op-ed in the Globe and Mail bringing the letter’s message to Canadians from coast to coast to coast.
Dr. Kerr notes, “it is not often that captains of industry, scientists and technologists call for prohibitions on innovation of any sort, let alone an outright ban. But the Canadian AI research community is clear: We must not permit AI to target and kill without meaningful human control. Playing Russian roulette with the lives of others can never be justified. The decision on whether to ban autonomous weapons goes to the core of our humanity.”
This letter has been released one week before the international community meets under the auspices of the CCW to discuss the issue of autonomous weapons systems. Mines Action Canada’s Programme Coordinator, Erin Hunt will be attending the meeting next week in Geneva. She said “in past discussions at the CCW, some states have expressed concern that a prohibition on autonomous weapons systems would have a negative impact on AI research more broadly. This letter and the similar one released by Australian AI experts show that those concerns are misplaced. The AI research community is calling for the opposite – bold and decisive action to prohibit autonomous weapons systems in order to support the development of AI that would benefit humanity.”
The Campaign to Stop Killer Robots is deeply disappointed that the Convention on Conventional Weapons (CCW) has cancelled a crucial week of formal discussions on fully autonomous weapons in August. This step was taken because of the failure of several states, most notably Brazil, to pay their assessed dues for the convention’s meetings.
“The collective failure of countries to find a solution to their financial woes doesn’t mean they can stop addressing concerns over weapons that would select and attack targets without further human intervention,” said Mary Wareham of Human Rights Watch, coordinator of the Campaign to Stop Killer Robots. “If the CCW is unable to act, nations must find other ways to maintain the momentum toward a ban,” she said. “Countries that agree with the need to retain human control of weapons systems should move swiftly to adopt national policies and laws and to negotiate a new international treaty prohibiting fully autonomous weapons.”
The call for a preemptive ban on fully autonomous weapons has been endorsed by 19 countries and dozens more states have affirmed the need to retain human control over the selection of targets and use of force. This clearly indicates that they see a need to prevent the development of fully autonomous weapons. Last December, China became the first permanent member of the UN Security Council to find that new international law is required to regulate fully autonomous weapons.
The Campaign calls on Canada and all countries to urgently address the enormous humanitarian challenges posed by these weapons by endorsing the call for a ban. It is vital and urgent that all stakeholders work together to secure a new international treaty before these weapons are unleashed.
“Canada has a long history of taking action when the CCW is unable to move forward,” said Paul Hannon, Executive Director of Mines Action Canada, a co-founder of the Campaign to Stop Killer Robots. “We are calling on Canada to act now to ensure that there is always meaningful human control over weapons. The international community cannot let the work done thus far go to waste.”
The Campaign to Stop Killer Robots fundamentally objects to permitting machines to take a human life on the battlefield or in policing, border control, and other circumstances. It calls for a preemptive ban on fully autonomous weapons through new international law as well as through domestic legislation.
Following the launch of the Campaign to Stop Killer Robots and a debate in the Human Rights Council, countries agreed in November 2013 to begin discussing what they called lethal autonomous weapons systems at the Convention on Conventional Weapons at the United Nations in Geneva. The CCW is a framework treaty that prohibits or restricts certain weapons and its 1995 protocol on blinding lasers is an example of a weapon being preemptively banned before it was acquired or used.
Most of the CCW’s 124 high contracting parties participated in three meetings on lethal autonomous weapons systems in 2014-2016, in addition to UN agencies, the International Committee of the Red Cross, and the Campaign to Stop Killer Robots. Last December at their Fifth Review Conference CCW states decided to formalize and expand those deliberations by establishing a Group of Governmental Experts on lethal autonomous weapons systems to meet in August and November 2017, chaired by Ambassador Amandeep Singh Gill of India.
However, on 30 May, the CCW’s president-designate Ambassador Matthew Rowland of the UK announced that the Group of Governmental Experts meeting scheduled for 21-25 August had been cancelled due to a lack of funds. Previously, Rowland had issued several warnings that the lack of payment of assessed financial contributions would likely mean the cancellation of CCW meetings planned for 2017.
Several countries have financial arrears from previous years, but according to the UN’s official summary, Brazil accounts for 86 percent of the outstanding contributions due to four core humanitarian disarmament treaties, including the CCW. Brazil last paid its assessed CCW contributions in 2010. The Campaign to Stop Killer Robots has appealed to Brazil to pay its outstanding contributions without delay and it challenges CCW states to achieve cost saving measures in other ways that do not require the cancellation of key meetings.
Several autonomous weapons systems with various degrees of human control are currently in use by high-tech militaries including CCW states the US, China, Israel, South Korea, Russia, and the UK. The concern is that low-cost sensors and advances in artificial intelligence are making it increasingly practical to design weapons systems that would target and attack without any meaningful human control. If the trend towards autonomy continues, humans will start to fade out of the decision-making loop, first retaining only a limited oversight role, and then no role at all.
Canada, France, UK, and the US supported establishing the CCW Group of Governmental Experts last December, but remain unambitious in their overall goals for the process by proposing a focus on sharing best practices and achieving greater transparency in the conduct of legal reviews of new weapons systems. Russia openly opposed the creation of a Group of Governmental Experts, but did not block multilateral consensus for establishing one.
Originally published on the Forum on the Arms Trade’s Looking Ahead blog, Erin Hunt looks at opportunities and challenges ahead in 2017 for efforts to preemptively ban autonomous weapons systems.
2017 has the potential to be a pivotal year in efforts to ensure that all weapons have meaningful human control. For three years, the Convention on Conventional Weapons (CCW) has been discussing lethal autonomous weapons (future weapons that could select and fire upon a target without human control). In December 2016, the Review Conference of the CCW decided to establish a Group of Governmental Experts (GGE), chaired by Ambassador Amandeep Singh Gill of India, which will meet over 10 days in 2017 and then report back to the CCW’s annual meeting on 22-24 November.
A GGE is a more formal level of meetings than the ones held in 2014, 2015 and 2016. States will be expected to bring their own experts and participate actively in discussions, instead of listening to presentations by outside experts and asking questions of those experts. The first meeting of the GGE will be held at the UN in Geneva on either 24-28 April or 21-25 August 2017. The date is dependent on when funds are available for the meeting. The second meeting of the GGE will be on 13-17 November, just before the annual CCW meeting.
In 2016, the number of states calling for a pre-emptive ban on fully autonomous weapons more than doubled. At the time of writing, Algeria, Argentina, Bolivia, Chile, Costa Rica, Cuba, Ecuador, Egypt, Ghana, Guatemala, Holy See, Mexico, Nicaragua, Pakistan, Panama, Peru, State of Palestine, Venezuela and Zimbabwe have called for a ban while a number of other states seem to support new international humanitarian law of some sort to deal with autonomous weapons systems.
This GGE is a large step towards a pre-emptive ban on autonomous weapons systems but there are a number of challenges ahead in 2017. First, the Russian Federation continues to object to more formal talks on autonomous weapon systems on the grounds that it is premature to move forward since there is not a clear understanding of the subject under discussion. That objection forgets that definitions are usually the last part of disarmament treaties to be negotiated. It was only at the very end of the 2016 CCW Review Conference that Russia agreed to not block the GGE.
Second, the majority of states, including my own, Canada, do not have national policies on autonomous weapons systems. However, this challenge is also an opportunity. The Campaign to Stop Killer Robots will be working hard around the world in 2017 to support the development of national policies on autonomous weapons systems. After three years of informal CCW experts meetings as well as discussions in the Human Rights Council, states have a large amount of information at their disposal to begin to craft national policies. States can also hold consultations on creating a national policy in advance of the GGE meetings.
Third, there is the possibility that the GGE may become distracted by the inclusion of a discussion item on best practices and greater transparency in Article 36 weapons reviews. These legal reviews are an obligation of states developing, purchasing or otherwise acquiring new weapons.
Although Article 36 weapons reviews should be a topic of discussion at the international level to strengthen both policy and practice around the world, better weapons reviews will not solve the problems associated with autonomous weapons systems and should not distract the GGE from the core of its work. Weapons reviews cannot answer moral, ethical, and political questions. An Article 36 review cannot tell us if it is acceptable to the public conscience for a machine to kill without meaningful human control. Autonomous weapons systems are often referred to as a revolution in warfare, and as such, moral, ethical and political considerations must not be pushed aside. These questions need to remain on the international agenda in 2017.
This year, we will witness significant work done at the national and international level to increase understanding of the challenges posed by autonomous weapons as well as the number of states calling for a pre-emptive ban. Stay tuned to see if the international community stands ready at year’s end to ensure that all weapons have meaningful human control.
The Convention on Conventional Weapons (CCW) Review Conference in December will decide whether to hold a Group of Governmental Experts (GGE) meeting on autonomous weapons systems in 2017. A GGE is the logical next step in the work to address concerns about autonomous weapons systems (or killer robots).
The Campaign to Stop Killer Robots is getting ready for the Review Conference here in Canada and around the world. Check out our colleagues at Reaching Critical Will for an update on the Preparatory Meeting of the CCW to see how the international preparations are going.
On the Canadian side, our Program Coordinator, Erin Hunt, was pleased to deliver the Campaign’s statement to the United Nations General Assembly’s First Committee on October 12.
Over the next month and a bit, we will be talking with parliamentarians, civil society and academics to help ensure that Canada takes a strong position at the Review Conference and beyond. You can help by writing to your MP to ask that Canada outline a national policy on autonomous weapons, or by donating online to support our work.