
The Future of Autonomous Weaponry: A Student’s Perspective

The 21st century has been marked by technological advancements – like the internet – that have changed the way we live and made the world more connected. As a result of these advancements, particularly the internet, privacy laws have changed to adapt to new norms and to ensure that technology does not harm citizens. Whereas in the past we have had to create laws to catch up with new technologies, in the case of autonomous weaponry it is important that we pre-emptively create laws to limit the negative impacts associated with the technology. This requires lawmakers to have an in-depth understanding of these technologies and their implications; otherwise they will not be able to legislate effectively.

“How does [hateful information about me] show up on a 7-year-old’s iPhone?” Iowa Representative Steve King (R) asked Google CEO Sundar Pichai during his congressional testimony.

The quote above demonstrates a blatant lack of understanding of technology by a U.S. lawmaker. Representative Steve King, 69, asking the CEO of Google about a problem with an iPhone, a product of an entirely different company, epitomizes this failure to understand how technology works. Such a fundamental lack of understanding is troubling. How can we trust lawmakers to create effective laws to limit the negative impacts of more advanced technology such as artificial intelligence (AI) when they don’t even understand basic technology like their cellphones?

The fact is, we have a role to play too. Modern issues require modern solutions, and it’s up to us to hold lawmakers accountable and ensure that they understand the implications of new technologies.

One such technology under development is fully autonomous weapons systems. These systems could be used as weapons of war and for surveillance, policing, and border control, giving governments more control over the lives of citizens. Fully autonomous weapons could be drones and other uncrewed vehicles programmed to target certain groups indiscriminately, without human intervention. While these systems aren’t in use yet, it is urgent that we create legislation to limit their negative impact. If we do not, the ramifications of this technology could be widespread. Employing autonomous weapons systems increases the possibility that innocent civilians would be harmed during wartime and increases governments’ capability to surveil their populations. It is important that we maintain a human perspective during times of conflict to limit these ramifications.

The Campaign to Stop Killer Robots is an international coalition of more than 100 non-governmental organisations across 54 countries working to pre-emptively ban the use of fully autonomous weapons systems. As this technology is developed, international and national lawmakers must work to create a legal framework to ensure that its applications are limited.

This is where students come in. We have grown up in the age of technology and typically have a strong grasp of new technologies and their implications. We are the students of today, but we’ll also be the lawmakers of tomorrow. Because of this, it makes sense to involve us in decision-making at this early stage, to ensure that a comprehensive legal framework is implemented for this new technology.

You may be wondering, “How can I get involved?” It’s simple! There are lots of ways that students can effectively get involved in spreading the word about the Campaign to Stop Killer Robots, many of which are enabled by modern technology.

Reaching out to your Member of Parliament (MP)/Representative: Writing letters can be a great way to make your MP aware of an issue that you are passionate about. Students can outline their concerns about this new technology and share them with their MP. A well-written letter or email forces the MP to take a critical look at the issue, and if enough people reach out, MPs will realize the importance of the issue and act.

Social Media: In the 21st century, social media has been very effective at spreading messages and driving change. Sites such as Twitter and Facebook are useful tools for spreading information about the Campaign at a national level, which helps more students and citizens get involved. As a first step, you can follow the Campaign to Stop Killer Robots on Twitter at @BanKillerRobots, and on Facebook or Instagram at @StopKillerRobots. You can also use the hashtags #KillerRobots, #StudentsAgainstKillerRobots, and #AutonomousWeapons to join the conversation, and follow the Campaign on YouTube!

On-Campus Events: What better way to engage students than by offering them food? Students who wish to get involved can get together and host a bake sale or a pizza party to raise awareness of the Campaign on campus. Longer term, this could be supported by creating on-campus Campaign representatives across Canada. These reps would promote the Campaign on campus through events, flyers, class engagements, or other activities. This would be an excellent way to ensure that hundreds of students have daily exposure to the Campaign.

The Campaign to Stop Killer Robots has been working hard to spread their message and to get support for a ban on fully autonomous weaponry. As students, we are the future of our world and are also well versed in technology, and as such, we should be involved in the decision making. We will make a difference in creating effective policy to ensure that this ever-evolving technology is properly regulated. If you would like more information please do not hesitate to reach out to the Campaign on social media!

Tyler Bloom interned at Mines Action Canada as the Campaign Assistant for the Campaign to Stop Killer Robots from January to April 2019. Tyler is an undergraduate student studying Political Science at the University of Ottawa.

Turing Award Winners Call for a Ban on Killer Robots

Last week, Yoshua Bengio, Geoffrey Hinton, and Yann LeCun were named the winners of the 2018 Turing Award. The A.M. Turing Award, sometimes referred to as computing’s Nobel Prize, is given by the Association for Computing Machinery for major contributions of lasting importance to computing.

All three honourees are supporters of a ban on autonomous weapons, having signed public letters calling on states to take action. Yoshua Bengio and Geoffrey Hinton, who are both based in Canada, spoke with the Toronto Star regarding their concerns about autonomous weapons and the lack of progress on a ban.

While the award was being announced, states were meeting at the United Nations for the Group of Governmental Experts on lethal autonomous weapons systems. The meeting was marked by a minority of states delaying progress sought by the majority. States need to heed the concerns and warnings of experts like the 2018 Turing Award winners and of members of the general public who do not want to see autonomous weapons developed.

There should be limits to autonomy in weapons

Statement delivered by Paul Hannon, Mines Action Canada, at the Group of Governmental Experts Meeting in March 2019

Thank you Chairperson

On Monday a delegation mentioned the Convention on Cluster Munitions. Many experiences with that life-saving treaty are relevant to this week’s discussions, not least the importance of leaving definitions until you are at the negotiating table.

For many years the refrain from a small number of states here in the CCW was that existing IHL was sufficient to cover cluster munitions, but the large number of states who held a different view found a new pathway to address their concerns. For those who are not yet a State Party to the Convention on Cluster Munitions: the treaty was negotiated in 2008 and bans an indiscriminate, inhumane and unreliable weapon system that causes unacceptable harm to civilians. It entered into force and became a new part of International Humanitarian Law on August 1st, 2010. To date, 120 states have joined the CCM, and the treaty has impacts even in countries that have not yet joined; for example, weapons manufacturers and financial institutions have ceased to produce or invest in cluster munitions due to the reputational risk and the effects of such risk on their bottom line.

Last week the latest government to do so announced the destruction of its entire stockpile of cluster munitions, joining a long list of States Parties that have destroyed their stockpiles, including many of the states that spoke this week or in previous GGE sessions on weapons reviews.

Presumably, many of the states that had acquired cluster munitions did Article 36 weapons reviews using the IHL existing at the time. I say presumably because information on these reviews is not publicly available.

Yet having undertaken their weapons reviews, acquired the weapon, and in some cases used it, those states ended up agreeing that the IHL existing at that time needed a new instrument specific to that weapon, and they then joined the new cluster munitions treaty.

IHL is not static. Since the CCW negotiated Protocol IV, a preventive instrument for a weapon that had never been deployed, five other additions have been made to IHL, illustrating that it does evolve and, through new legally binding instruments, becomes even stronger.

Mines Action Canada believes all states should undertake national weapons reviews before acquiring any weapon. We also believe current processes can be more robust and transparent. Public trust in the viability and acceptability of weapons is very important. Mines Action Canada would be pleased to assist any international efforts to improve these reviews, but that is not the work of this GGE.

Compared to autonomous weapons systems, cluster munitions are technologically a much more straightforward weapon. It is hard, therefore, to reconcile the fact that existing IHL was not sufficient for cluster munitions and required a new addition, yet is somehow sufficient for a new weapon system built on emerging, unproven, untested, and unpredictable technologies.

Our experience with cluster munitions and other weapons leads to the inevitable conclusion that, from a humanitarian perspective, Article 36 weapons reviews on their own are insufficient to protect civilian populations from autonomous weapons systems.

New international law is needed to address the multitude of concerns with autonomous weapon systems. We believe this is possible by 2020 and urge High Contracting Parties to agree to a negotiating mandate in November.

Chairperson, the ICRC is well known for reminding us that even wars have limits. Mines Action Canada believes that the same applies to autonomy in weapons: even with autonomy there are limits. It is time to negotiate another legally binding instrument, either here or elsewhere, for three key reasons: first, to protect civilians; second, to ensure that research and development of the beneficial uses of these new technologies continues and is not tainted by the stigmatizing impact of fully autonomous weapons; and finally, to come to a common agreement on how retaining meaningful human control will help define those limits to autonomy.

Thank you.

61% opposed to development of Killer Robots

[Infographic: Opposition is rising]

  • In the 26 countries surveyed in 2018, more than three in every five people (61%) responding to a new poll oppose the development of weapons systems that would select and attack targets without human intervention.
  • Two-thirds (66%) of those opposed to lethal autonomous weapons systems were most concerned that they would “cross a moral line because machines should not be allowed to kill.”
  • More than half (54%) of those opposed said that they were concerned that the weapons would be “unaccountable.”
  • A near-identical survey in 23 countries by the same company in January 2017 found that 56% of respondents were opposed to lethal autonomous weapons systems; opposition has grown to 61% as of December 2018.
  • A majority opposed killer robots in Canada (60%); China (60%); Russia (59%); the UK (54%); and the US (52%).

The results of this poll show that public sentiment is against the development of killer robots. A minority of states at the November 2018 meeting of the annual Convention on Conventional Weapons at the UN in Geneva used consensus rules to thwart meaningful diplomatic progress. Russia, Israel, South Korea, and the United States indicated at the meeting that they would not support negotiations for a new treaty. Currently, 28 states are seeking to ban fully autonomous weapons. Austria, Brazil, and Chile have formally proposed the urgent negotiation of “a legally-binding instrument to ensure meaningful human control over the critical functions” of weapons systems.

[Infographic: More states calling]

Mary Wareham of Human Rights Watch, Coordinator of the Campaign to Stop Killer Robots, said:

“The window to prevent the development of fully autonomous weapons is closing fast. This poll shows that public opposition is rising and with it the expectation that governments will act decisively and with urgency to deal with this concern.”

“The results of this poll show that public views in nations often identified as most in favour of killer robots, such as the US and Russia, oppose the development of these weapons.

“Efforts to address concerns over killer robots via the Convention on Conventional Weapons have failed to move to negotiate new law due to the objections of a handful of states.”

Six out of ten Canadian respondents to the poll opposed the development of weapons systems that would select and attack targets without human intervention, commonly known as autonomous weapons systems or killer robots. Canadian opposition to autonomous weapons was on par with the global results, but support for such weapons was below the global average: only 15 percent of Canadian respondents supported the use of autonomous weapons, while 25 percent were not sure.

[Infographic: A clear message]

“Canada has a history of leadership on peace and disarmament coupled with a strong AI sector that has been quite outspoken on this issue. A national ban on autonomous weapons systems and leadership internationally are in line with both a feminist foreign policy and the emphasis the government has put on AI as a future driver of the Canadian economy,” said Erin Hunt, Program Manager at Mines Action Canada, a co-founder of the Campaign to Stop Killer Robots.

“The Government of Canada should take note and listen to the voice of their people. Our shared security and humanity hinges on retaining meaningful human control over the use of force. Now is the time for political leadership to begin negotiations of a new treaty to prohibit fully autonomous weapons systems.”

The survey by the market research company Ipsos was commissioned by the Campaign to Stop Killer Robots and conducted in December 2018, with a sample size of 500 to 1,000 people in each country.

Forgotten Communities

Today, Mines Action Canada launched a new briefing paper on bias, intersectionality and autonomous weapons systems. Read the briefing note here.

The Way Forward

On August 29, 2018, Mines Action Canada delivered the following statement at the Convention on Conventional Weapons Group of Governmental Experts on Autonomous Weapons Systems.

Thank you Mr. Chair. As a co-founder of the Campaign to Stop Killer Robots, Mines Action Canada urges states to start negotiating new international law to create a treaty that bans fully autonomous weapons and retains meaningful human control over the use of force.

As an organization from Canada, which has put a focus on Artificial Intelligence as a driver of our future economy, we see the prohibition of autonomous weapons systems as safeguarding public trust in AI and robotics.

This year’s Canadian trust survey by Proof – a polling and public relations firm – found that only 39 per cent of Canadians trust that artificial intelligence will contribute positively to the Canadian economy, and even fewer women believe this to be true, at 36 per cent. It also found that only 25 per cent of Canadians trust AI companies to do what is best for Canada and Canadians. These levels of public trust will present a problem for the commercial success of AI in the future.

Public trust in the technology is absolutely crucial to its transition from a cool techy thing to an integral part of our lives. If the technology is weaponized, it will be much harder for it to become a useful part of our lives.

At yesterday’s side event, the panelist from the Future of Life Institute clearly outlined how scientists and relevant subject matter experts are concerned about the risks autonomous weapons systems pose to the reputation of new technology, and that they are not worried about any risks a ban would pose to dual-use technology. Protocol IV of the CCW and the Chemical Weapons Convention have shown us that weapons can be prohibited without risking the development of beneficial technology.

I will conclude with a few words about the concept of risk. We have spent over five years discussing autonomous weapons systems here at the CCW, and throughout these talks experts from around the world have outlined the risks posed by autonomous weapons systems: technological risks, humanitarian risks, security risks, moral risks and political risks. It is very clear that there are significant risks in developing autonomous weapons systems.

As we heard at a side event in April, the financial community makes its decisions based on risk: if an investment is too risky, you don’t make it, even if the potential for a big payoff is there. We are constantly surprised that, after hearing so many expert assessments that this technology poses high risks to civilian populations, some states still object to developing new IHL to prohibit AWS because there might be tangential benefits from the technology.

It’s one thing to risk money; it’s another thing entirely to risk other people’s lives.

High Contracting Parties should make the responsible choice in the face of overwhelming risk and start negotiating new international law to create a treaty banning fully autonomous weapons.

Thank you.

What’s Gender Got to Do With It?

Mines Action Canada, Project Ploughshares, the Women’s International League for Peace and Freedom, and the International Committee for Robot Arms Control, together with the Government of Canada, are hosting a briefing event during the Convention on Certain Conventional Weapons Group of Governmental Experts Meeting in Geneva. The lunchtime event will be held on Thursday, August 30; please see the flyer for more details.

Autonomous Weapons and the Future of Warfare

The discussion about autonomous weapons is coming to Ottawa!

Join us on August 16th to hear from national experts about the future of warfare and Canada’s role.

Reserve your place online.

 


Meet Isabelle, the Campaign to Stop Killer Robots’ Project Officer

The Campaign to Stop Killer Robots has hired its first full-time staff person. Isabelle Jones is the Campaign to Stop Killer Robots’ Project Officer, based in Ottawa with Mines Action Canada. As she gets settled into her new role, we sat down to chat.

MAC: You have an educational and work background in human rights and global development.  When (and why) did you become interested in disarmament issues? 

IJ: In my fourth year of undergraduate studies in Global Development, I moved towards focusing my studies on the intersection of development and conflict – how development happens or stalls in complex contexts, fragile regions, and in the post-conflict period. In one of my classes we watched a clip from a documentary on the impact that landmines have had – and continue to have – in Cambodia. I was already a little familiar with landmines and landmine action after participating in a Red Cross presentation on the topic, but watching that documentary it seemed to click that these weapons weren’t just devastating at the point of detonation, but could continue to impact the development of communities, regions, and even countries long after conflict ends.

After class, some quick Internet searching led me to the International Campaign to Ban Landmines (ICBL), and then the Mines Action Canada (MAC) website. Learning about MAC’s work and their youth internship program, I decided that the 8-month internship would be the perfect post-graduation work opportunity. I could take the year to learn more about humanitarian disarmament and the long process of recovery that follows conflict, and then apply to grad school. Unfortunately, timing was not on my side. The start date for the program shifted and I wouldn’t complete my program in time to be eligible, but that interest in disarmament work never went away.

MAC: And your interest in weapons technology?

IJ: I started thinking more and more about weapons technology. How has military technology, and the militarization of technology evolved since the laws of war were codified? How does this impact the lives and rights of civilians? And what does it say about how society views and values the human cost of war? I applied for my Master’s program with a proposal to research the use of drone technology in international and non-international armed conflicts, and the implications of this technology for international human rights and international humanitarian law. Over the course of my research my focus shifted slightly, and ultimately my dissertation argued that drone technology is deployed within a modern, bureaucratized system of labour – an institutional structure that can condition, shape and blind people to partake in morally, ethically and legally questionable acts of violence.

MAC: How did you learn about the Campaign to Stop Killer Robots?

IJ: Several members of the Campaign to Stop Killer Robots, like Article 36, HRW, ICRAC and PAX, have written and published research on armed drones, so I came across them in my dissertation research. This led me to learn about the work of the campaign, which I continued to follow throughout my studies and after their completion. I saw the proliferation of armed drones as a precursor to the lethal autonomous weapons systems that the campaign works to prohibit, and agreed with the campaign’s stance that it is essential to maintain human control over combat weapons. I have followed the work of the campaign closely and am honoured to be joining such a dedicated, passionate team of campaigners!

MAC: You will be working out of the Mines Action Canada office. What do you know about MAC’s work in humanitarian disarmament?

IJ: For decades MAC has been a leader in the global disarmament community, playing key roles in the International Campaign to Ban Landmines, Cluster Munition Coalition and (of course) the Campaign to Stop Killer Robots. Working nationally and internationally, MAC seeks to eliminate the consequences of indiscriminate weapons – weapons that cannot differentiate between civilians and combatants, and legitimate or illegitimate targets. This is the work that first sparked my interest and passion in humanitarian disarmament. After first hoping to become a MAC intern all those years ago, I am thrilled to now be working out of the Mines Action Canada office.

MAC: What are you most looking forward to in your new job?

IJ: I am most looking forward to the variety of the work. There is something very exciting about working in an environment where every day is a little different and there are always new challenges and opportunities to learn landing on your desk – which I think is part of the nature of working on a campaign that is small on staff, big on goals!

MAC: What do you like doing in your spare time? 

IJ: In my spare time I love getting outdoors − camping, hiking, canoeing, scuba diving – and exploring new places through travel. Next on my travel bucket list is hiking in the Patagonia region of Chile. I am also an avid reader and you can often find me curled up on the couch with a new book, or re-reading one of my favourites for the umpteenth time.

 

Opening Statement for the April 2018 Group of Governmental Experts

Delivered by Paul Hannon, Executive Director

Thank you Mr. Chairman. As a co-founder of the Campaign to Stop Killer Robots and a long-time advocate for humanitarian disarmament, Mines Action Canada supports the statement delivered by the Campaign’s Coordinator.

In many ways 2017 was a lost year for efforts to prohibit autonomous weapons here, so we are hoping to see significant progress at the CCW in 2018.

Outside of these walls, though, the conversation about autonomous weapons progressed at the end of 2017 and the start of 2018.

In November, over 200 Canadian Artificial Intelligence experts released an open letter to Prime Minister Justin Trudeau calling for Canadian leadership on autonomous weapons systems. These Canadian experts are still waiting for a response from the government of Canada. Similar national letters have been released in Australia and Belgium.

Two weeks ago the G7 Innovation Ministers released a Statement on Artificial Intelligence which cited the need to increase trust in AI and included a commitment to “continue to encourage research, including […] examining ethical considerations of AI.”

This week should provide opportunity for states to share and expand on their positions with regards to autonomous weapons systems and the need for meaningful human control. States should not overlook the ethical, humanitarian and human rights concerns about autonomous weapons systems as we delve into some technical topics.

Mr. President, CCW protocols have a history of addressing the ethical and humanitarian concerns about weapons. Protocol IV on blinding laser weapons is particularly relevant to our discussions. As a pre-emptive prohibition on an emerging technology motivated by ethical concerns, Protocol IV has been very effective in preventing the use of an abhorrent weapon without limiting the development of laser technology for other purposes, including other military purposes. It is important to note that Protocol IV has some of the widest membership of all the protocols, including all five permanent members of the United Nations Security Council, all the states that have chaired the autonomous weapons talks here at the CCW, and most of the states that have expressed views about autonomous weapons. All of those states are party to a Protocol that banned, for ethical reasons, a weapon before it was ever deployed in conflict.

Above all, we hope that the states present this week will reflect on the concept of responsibility. The Government of Poland’s working paper which discusses this topic is a useful starting point. We see responsibility as a theme that runs throughout these discussions.

A Canadian godfather of Artificial Intelligence has often spoken of the need to pursue responsible AI. Responsible AI makes life better for society and helps “prevent the misuse of AI applications that could cause harm” as noted in the G7 Annex.

We have been entrusted with a great responsibility here in this room. We have the responsibility to set boundaries and prevent future catastrophes. We must be bold in our actions or we could face a situation where computer programmers become de facto policy makers.

Above all, as part of our collective humanity, we must remain responsible for our actions – we cannot divest control to one of our creations whether it is in our daily actions, or more crucially for this week’s discussion, in our decisions to use weapons.

In the past, those sitting in these seats have met their responsibility to “continue the codification and progressive development of the rules of international law applicable in armed conflict” by negotiating new protocols and, in the case of blinding laser weapons, a pre-emptive protocol. Now it is our turn, and this is our issue to address.

Thank you.

Download the full statement here