There are countless issues facing youth worldwide today. We are at the forefront of several causes – climate change, gun violence, women’s rights, racial equality, freedom of expression, and economic issues have all made headlines recently as young people took to the streets in droves as advocates for their futures. My generation has the ability to plan and instigate mass demonstrations (thanks to our access to global internet communication networks), and we have put our resources to good use. Even during the pandemic, youth demonstrations have continued in several countries, both in person and over social media. Our access to communications technology and higher education is a privilege no previous generation has benefited from, and now more than ever young people are using it to organize and make their voices heard.
However, being the most technologically advanced generation is a double-edged sword. Our personal data is permanently available on the internet. For many of us, this started in childhood, long before there were any legal regulations restricting the use of data on the internet. Today, our virtual vulnerability is beginning to inform the issues most important to youth. There is still a relative lack of legal protections that could keep our personal data safe, meaning we are at risk of all manner of technological threats. This year, activists in several countries were tracked using social media data and facial recognition software after attending protests. It is a very dystopian reality: our faces do not belong to us, and can be used against us at any time. I still remember going to Fridays for Future marches in 2019 – not carrying identification or money, wearing a hat and large sunglasses. As a white woman living in Vancouver, Canada, I am certainly not at the same level of risk as my fellow protestors in the United States or Hong Kong – but the looming threat of data still hangs over me. I (along with all other young activists) live with the reality that I cannot be anonymous – a fact that makes many of us hesitant to speak freely about causes that matter to us. This vulnerability has also become uncomfortably apparent in light of recent developments in weapons manufacturing.
The production of fully autonomous weapons systems (or “killer robots”) is fundamentally an issue of human rights in a digital age. The potential hazards of these systems are beyond comprehension – and beyond the scope of international law. For my generation, who have lived our entire lives with the internet, this is a serious issue, the severity of which we are only beginning to grasp.
In December of 2020, youth activists from 20 countries gathered for a virtual conference, to share their individual concerns regarding the development of killer robots. As the youth representative for Canada, I logged on to my laptop at 2:00am to join the panel, which encompassed 10 different time zones (mine being the biggest time difference). Though we suffered, at times, through internet connectivity issues and the general inferno that is the Zoom webinar platform, we persevered. I believe we all understood it was the only way – though the impact of a virtual conference may be lesser than that of a physical event, we felt a certain sense of duty to at least try.
Several risks were raised – the representative from Argentina shared her fear that Indigenous communities would be oppressed further by governments and private corporations armed with killer robots. The representative from Vietnam expressed concern over the perilous situation facing his country in the South China Sea, and the danger it would face should other militaries in the region implement these weapons. Another concern, raised by the representative from the United States, was focused on the financial cost of producing the weapons in his country, where rates of unemployment and poverty are climbing. Several of the concerns raised by my fellow speakers reflect other issues that matter to our generation: the rights of minority communities, who will be at great risk if their oppressors gain access to these weapons, and freedom of expression, given the concerns raised over the safety of activists should their governments implement killer robots.
One of the concerns raised was the central point in my own speech: the issue of technological advancement and legality. Currently, international law does not have any precedent that could address the legality of killer robots. Technological development worldwide continues to be somewhat of a legal “no man’s land”, in that much of it has occurred without any restrictions to limit its scope. For killer robots, this means there is nothing regulating their development or implementation as of yet. Combined with the lack of legal regulations over the use of personal data, the risk becomes clearer. Killer robots will mostly require artificial intelligence to function, and artificial intelligence requires data to “think”. For those of us whose personal data has been available on the internet since we were children, this is a massive risk to our safety and freedom.
The most common thread among the speeches from my fellow activists was the act of calling upon our governments to take the threat of killer robots seriously. We feel that, like many issues youth are facing today, it is pushed aside in favour of more immediate concerns. Like climate change, the dangers of killer robots seem far-off to many people – they will happen someday, but that day is part of a distant, imagined future that does not concern them. For my generation, that future does not feel so distant. If killer robots are implemented as far in the future as 2040, I’ll be forty-one years old – young enough to be middle-aged, and spend the last half of my life in a world with autonomous weapons. Like climate change, the existence of killer robots is closer than we think, and young people like me know that it will affect us.
Attending the youth conference gave me a strange sense of optimism. For many of us, 2020 was a difficult year. I felt the weight of it on my shoulders as I sat in my desk chair in the middle of the night to sign into Zoom, knowing that had things been different I might have been able to travel and see my fellow speakers in person. Yet somehow, I felt lighter when I finally closed my laptop screen at 6:00am. I think it was the lightness that comes from being among likeminded people – from not being alone. I was in a (virtual) room full of people my age who know about an issue that I am passionate about, and who are just as passionate as I am. The lack of human connection that comes with virtual events was a barrier, but I think we broke through it. To me, the youth conference felt like the start of something – I don’t know exactly what, but I like to think it will be good.
Tara Osler was Mines Action Canada’s Summer 2020 Research and Communications Assistant and is currently studying at the University of British Columbia.
The 21st century is marked by technological advancements – like the internet – that have changed the way we live and have made the world more connected. As a result of these advancements, privacy laws have changed to adapt to new norms and to ensure that technology does not harm citizens. While previously we have had to create laws to adapt to new technologies, in the case of autonomous weaponry it is important that we pre-emptively create laws to limit the negative impacts associated with these technologies. This requires lawmakers to have an in-depth understanding of the technologies and their implications; otherwise they will not be able to legislate effectively.
“How does [hateful information about me] show up on a 7-year-old’s iPhone?” asked Iowa representative Steve King (R) to Google CEO Sundar Pichai during his congressional testimony.
The quote above demonstrates a blatant lack of understanding of technology by a U.S. lawmaker. Representative Steve King, 69, asking the CEO of Google about a problem with an iPhone – a product made by a different company – is the epitome of not understanding how technology works. This fundamental lack of understanding is troubling. How can we trust lawmakers to create effective laws to limit the negative impacts of more advanced technology such as artificial intelligence (AI) when they don’t even understand basic technology like their cellphones?
The fact is, we have a role to play too. Modern issues require modern solutions, and it’s up to us to hold lawmakers accountable and ensure that they understand the implications of new technologies.
One such technology under development is the fully autonomous weapons system. These systems could be used as weapons of war, or for surveillance, policing, and border control, giving governments more control over the lives of citizens. Fully autonomous weapons could be drones and other uncrewed vehicles programmed to target certain groups indiscriminately, without human intervention. While these systems aren’t in use yet, it is urgent that we create legislation to limit their negative impact. If we do not, the ramifications of this technology will be widespread. Employing autonomous weapons systems increases the possibility that innocent civilians will be harmed during wartime, as well as the capability of governments to surveil their populations. It is important that we maintain a human perspective in times of conflict to limit these ramifications.
The Campaign to Stop Killer Robots is an international coalition of more than 100 non-governmental organisations across 54 countries working to pre-emptively ban the use of fully autonomous weapons systems. As this technology is developed, international and national lawmakers must work to create a legal framework to ensure that its applications are limited.
This is where students come in. We have grown up in the age of technology, and typically have a very strong grasp of new technology and its implications. We are the students of today, but we’ll also be the lawmakers of tomorrow. Because of this, it makes sense to have us involved in the decision making at this early stage, to ensure that a comprehensive legal framework is implemented for this new technology.
You may be wondering, “How can I get involved?” It’s simple! There are lots of ways that students can effectively get involved in spreading the word about the Campaign to Stop Killer Robots, many of which are enabled by modern technology.
Reaching out to your Member of Parliament (MP)/Representative: Writing letters can be a great way to make your MP aware of an issue that you are passionate about. Students can outline their concerns about this new technology and share them with their MP. A well-written letter or email forces the MP to take a critical look at the issue, and if enough people reach out, then they will realize the importance of the issue and act.
Social Media: In the 21st century, social media has been very effective in spreading messages and pushing for change. Sites such as Twitter or Facebook are very useful tools for spreading information about the Campaign on a national level, which would help more students and citizens get involved. As a first step, you can follow the Campaign to Stop Killer Robots on Twitter at @BanKillerRobots, or on Facebook or Instagram at @StopKillerRobots. You can also use the hashtags #KillerRobots #StudentsAgainstKillerRobots #AutonomousWeapons to join the conversation! The Campaign is on YouTube as well!
On-Campus Events: What better way to get students engaged than by offering food to them? Students who wish to get involved can get together and host a bake sale or a pizza party to raise awareness on campus for the Campaign. This can be achieved by creating on-campus Campaign representatives across Canada. These reps would be involved in promoting the Campaign on campus, whether it be through events, flyers, class engagements, or other activities. This would be an excellent method to ensure that hundreds of students have daily exposure to the Campaign.
The Campaign to Stop Killer Robots has been working hard to spread their message and to get support for a ban on fully autonomous weaponry. As students, we are the future of our world and are also well versed in technology, and as such, we should be involved in the decision making. We will make a difference in creating effective policy to ensure that this ever-evolving technology is properly regulated. If you would like more information please do not hesitate to reach out to the Campaign on social media!
Tyler Bloom interned at Mines Action Canada as the Campaign Assistant for the Campaign to Stop Killer Robots between January and April 2019. Tyler is an undergraduate student studying Political Science at the University of Ottawa.
Last week, Yoshua Bengio, Geoffrey Hinton, and Yann LeCun were named the winners of the 2018 Turing Award. The A.M. Turing Award, sometimes referred to as computing’s Nobel Prize, is given for major contributions of lasting importance to computing by the Association for Computing Machinery.
All three honourees are supporters of a ban on autonomous weapons, having signed public letters calling on states to take action. Yoshua Bengio and Geoffrey Hinton, who are both based in Canada, spoke with the Toronto Star regarding their concerns about autonomous weapons and the lack of progress on a ban.
While the award was being announced, states were meeting at the United Nations for the Group of Governmental Experts on lethal autonomous weapons systems. The meeting was marked by a minority of states delaying progress sought by the majority of states. States need to heed the concerns and warnings of experts like the 2018 Turing Award winners and the general public who do not want to see autonomous weapons developed.
Statement delivered by Paul Hannon, Mines Action Canada, at the Group of Governmental Experts Meeting in March 2019
Thank you Chairperson
On Monday a delegation mentioned the Convention on Cluster Munitions. Many experiences with that life-saving treaty have relevance to this week’s discussions. Not the least of which is the importance of leaving definitions until you are at the negotiating table.
For many years the refrain from a small number of states here in the CCW was that existing IHL was sufficient to cover cluster munitions, but the large number of states who had a different view found a new pathway to address their concerns. For those who are not yet a State Party to the Convention on Cluster Munitions, the treaty was negotiated in 2008 and bans an indiscriminate, inhumane and unreliable weapon system that causes unacceptable harm to civilians. It entered into force and became a new part of International Humanitarian Law on August 1st 2010. To date 120 states have joined the CCM, and the treaty has impacts even in countries that have not yet joined: for example, weapons manufacturers and financial institutions have ceased to produce or invest in cluster munitions due to the reputational risk and the effects of such risk on their companies’ bottom lines.
Last week, the latest government to do so announced the destruction of its entire stockpile of cluster munitions, joining a long list of States Parties that have destroyed their stockpiles, including many of the states that spoke this week or in previous GGE sessions on weapons reviews.
Presumably, many of those states that had acquired cluster munitions did Article 36 weapons reviews, using the existing IHL at the time. I say presumably because information on these reviews is not publicly available.
Yet having undertaken their weapons reviews, then acquiring the weapon, and in some cases using it, they ended up agreeing that the existing IHL of the time needed a new instrument specific to that weapon, and they then joined the new cluster munitions treaty.
IHL is not static. Since the CCW negotiated Protocol IV, a preventive treaty for a weapon that had never been deployed, five other additions have been made to IHL illustrating that it does evolve and through new legally binding instruments becomes even stronger.
Mines Action Canada believes all states should undertake national weapons reviews before acquiring any weapon. We also believe current processes can be more robust and transparent. Public trust in the viability and acceptability of weapons is very important. Mines Action Canada would be pleased to assist any international efforts to improve these reviews, but that is not the work of this GGE.
Compared to autonomous weapons systems, cluster munitions are technologically a much more straightforward weapon. It is hard, therefore, to reconcile the fact that existing IHL was not sufficient for cluster munitions and required a new addition to IHL, but is somehow sufficient for a new weapon system with emerging, unproven, untested, and unpredictable technologies.
Our experience with cluster munitions and other weapons leads to the inevitable conclusion that, from a humanitarian perspective, Article 36 weapons reviews on their own are insufficient to protect civilian populations from autonomous weapons systems.
New international law is needed to address the multitude of concerns with autonomous weapon systems. We believe this is possible by 2020 and urge High Contracting Parties to agree to a negotiating mandate in November.
Chairperson, the ICRC is well known for reminding us that even wars have limits. Mines Action Canada believes that the same applies to autonomy in weapons. Even with autonomy there are limits. It is time to negotiate another legally binding instrument, either here or elsewhere, for three key reasons: firstly to protect civilians; secondly to ensure that research and development of the beneficial uses of these new technologies continues and are not tainted by the stigmatizing impact of fully autonomous weapons; and, finally to come to a common agreement on how retaining meaningful human control will help define those limits to autonomy.
- In the 26 countries surveyed in 2018, more than three in every five people (61%) responding to a new poll oppose the development of weapons systems that would select and attack targets without human intervention.
- Two-thirds (66%) of those opposed to lethal autonomous weapons systems were most concerned that they would “cross a moral line because machines should not be allowed to kill.”
- More than half (54%) of those opposed said that they were concerned that the weapons would be “unaccountable.”
- A near-identical survey in 23 countries by the same company in January 2017 found that 56% of respondents were opposed to lethal autonomous weapons systems; opposition had grown to 61% as of December 2018.
- A majority opposed killer robots in Canada (60%); China (60%); Russia (59%); the UK (54%); and the US (52%).
The results of this poll show that public sentiment is against the development of killer robots. A minority of states at the November 2018 meeting of the annual Convention on Conventional Weapons at the UN in Geneva used consensus rules to thwart meaningful diplomatic progress. Russia, Israel, South Korea, and the United States indicated at the meeting that they would not support negotiations for a new treaty. Currently, 28 states are seeking to ban fully autonomous weapons. Austria, Brazil, and Chile have formally proposed the urgent negotiation of “a legally-binding instrument to ensure meaningful human control over the critical functions” of weapons systems.
Mary Wareham of Human Rights Watch, Coordinator of the Campaign to Stop Killer Robots, said:
“The window to prevent the development of fully autonomous weapons is closing fast. This poll shows that public opposition is rising and with it the expectation that governments will act decisively and with urgency to deal with this concern.”
“The results of this poll show that the public in nations often identified as most in favour of killer robots, such as the US and Russia, opposes the development of these weapons.
“Efforts to address concerns over killer robots via the Convention on Conventional Weapons have failed to move to negotiate new law due to the objections of a handful of states.”
Six out of ten Canadian respondents to the poll opposed the development of weapons systems that would select and attack targets without human intervention, commonly known as autonomous weapons systems or killer robots. Canadian opposition to autonomous weapons was on par with the global results, but support for such weapons was below the global average, with only 15 percent of Canadian respondents supporting the use of autonomous weapons while 25 percent were not sure.
“Canada has a history of leadership on peace and disarmament, coupled with a strong AI sector that has been quite outspoken on this issue. A national ban on autonomous weapons systems and leadership internationally are in line with both a feminist foreign policy and the emphasis the government has put on AI as a future driver of the Canadian economy,” said Erin Hunt, Program Manager at Mines Action Canada, a co-founder of the Campaign to Stop Killer Robots.
“The Government of Canada should take note and listen to the voice of their people. Our shared security and humanity hinges on retaining meaningful human control over the use of force. Now is the time for political leadership to begin negotiations of a new treaty to prohibit fully autonomous weapons systems.”
The survey by the market research company Ipsos was commissioned by the Campaign to Stop Killer Robots and conducted in December 2018, with a sample size of 500 to 1,000 people in each country.
Today, Mines Action Canada launched a new briefing paper on bias, intersectionality and autonomous weapons systems. Read the briefing note here.
On August 29, 2018, Mines Action Canada delivered the following statement at the Convention on Conventional Weapons Group of Governmental Experts on Autonomous Weapons Systems
Thank you Mr. Chair. As a co-founder of the Campaign to Stop Killer Robots, Mines Action Canada urges states to start negotiating new international law – a treaty to ban fully autonomous weapons and retain meaningful human control over the use of force.
As an organization from Canada, which has put a focus on Artificial Intelligence as a driver of our future economy, we see the prohibition of autonomous weapons systems as safeguarding public trust in AI and robotics.
This year’s Canadian trust survey by Proof – a polling and public relations firm – found that only 39 per cent of Canadians trust that artificial intelligence will contribute positively to the Canadian economy, and even fewer women believe this to be true, at 36 per cent. It also found that only 25 per cent of Canadians trust AI companies to do what is best for Canada and Canadians. These levels of public trust will present a problem for the commercial success of AI in the future.
Public trust in the technology is absolutely crucial to its transition from a cool techy thing to an integral part of our lives. If the technology is weaponized, it will be so much harder for it to become a useful part of our lives.
At yesterday’s side event, the panelist from the Future of Life Institute clearly outlined how scientists and relevant subject matter experts are concerned about the risks autonomous weapons systems pose to the reputation of new technology, and are not worried about any risks a ban would pose to dual-use technology. Protocol IV of the CCW and the Chemical Weapons Convention have shown us that weapons can be prohibited without risking the development of beneficial technology.
I will conclude with a few words about the concept of risk. We have spent over five years discussing autonomous weapons systems here at the CCW, and throughout these talks, experts from around the world have outlined the risks posed by autonomous weapon systems – technological risks, humanitarian risks, security risks, moral risks and political risks. It is very clear that there are significant risks in developing autonomous weapons systems.
As we heard at a side event in April, the financial community makes its decisions based on risk: if an investment is too risky, you don’t do it, even if the potential for a big payoff is there. We are constantly surprised that, after hearing so many expert assessments that this technology poses high risks to civilian populations, some states still object to developing new IHL to prohibit AWS because maybe there will be tangential benefits from the technology.
It’s one thing to risk money; it’s another thing entirely to risk other people’s lives.
High contracting parties should make the responsible choice in the face of overwhelming risk and start negotiating new international law to create a new treaty to ban fully autonomous weapons.
Mines Action Canada, Project Ploughshares, the Women’s International League for Peace and Freedom, and the International Committee for Robot Arms Control, together with the Government of Canada, are hosting a briefing event during the Convention on Certain Conventional Weapons Group of Governmental Experts Meeting in Geneva. The lunchtime event will be held on Thursday, August 30; for more details, please see the flyer.
The discussion about autonomous weapons is coming to Ottawa!
Join us on August 16th to hear from national experts about the future of warfare and Canada’s role.
The Campaign to Stop Killer Robots has hired its first full-time staff person. Isabelle Jones is the Campaign to Stop Killer Robots’ Project Officer, based in Ottawa with Mines Action Canada. As she gets settled into her new role, we sat down to chat.
MAC: You have an educational and work background in human rights and global development. When (and why) did you become interested in disarmament issues?
IJ: In my fourth year of undergraduate studies in Global Development, I moved towards focusing my studies on the intersection of development and conflict – how development happens or stalls in complex contexts, fragile regions, and in the post-conflict period. In one of my classes we watched a clip from a documentary on the impact that landmines have had – and continue to have – in Cambodia. I was already a little familiar with landmines and landmine action after participating in a Red Cross presentation on the topic, but watching that documentary it seemed to click that these weapons weren’t just devastating at the point of detonation, but could continue to impact the development of communities, regions, and even countries long after conflict ends.
After class, some quick Internet searching led me to the International Campaign to Ban Landmines (ICBL), and then the Mines Action Canada (MAC) website. Learning about MAC’s work and their youth internship program, I decided that the 8-month internship would be the perfect post-graduation work opportunity. I could take the year to learn more about humanitarian disarmament and the long process of recovery that follows conflict, and then apply to grad school. Unfortunately, timing was not on my side. The start date for the program shifted and I wouldn’t complete my program in time to be eligible, but that interest in disarmament work never went away.
MAC: And your interest in weapons technology?
IJ: I started thinking more and more about weapons technology. How has military technology, and the militarization of technology evolved since the laws of war were codified? How does this impact the lives and rights of civilians? And what does it say about how society views and values the human cost of war? I applied for my Master’s program with a proposal to research the use of drone technology in international and non-international armed conflicts, and the implications of this technology for international human rights and international humanitarian law. Over the course of my research my focus shifted slightly, and ultimately my dissertation argued that drone technology is deployed within a modern, bureaucratized system of labour – an institutional structure that can condition, shape and blind people to partake in morally, ethically and legally questionable acts of violence.
MAC: How did you learn about the Campaign to Stop Killer Robots?
IJ: Several members of the Campaign to Stop Killer Robots, like Article 36, HRW, ICRAC and PAX, have written and published research on armed drones, so I came across them in my dissertation research. This led me to learn about the work of the campaign, which I continued to follow throughout my studies and after their completion. I saw the proliferation of armed drones as a precursor to the lethal autonomous weapons systems that the campaign works to prohibit, and agreed with the campaign’s stance that it is essential to maintain human control over combat weapons. I have followed the work of the campaign closely and am honoured to be joining such a dedicated, passionate team of campaigners!
MAC: You will be working out of the Mines Action Canada office. What do you know about MAC’s work in humanitarian disarmament?
IJ: For decades MAC has been a leader in the global disarmament community, playing key roles in the International Campaign to Ban Landmines, Cluster Munition Coalition and (of course) the Campaign to Stop Killer Robots. Working nationally and internationally, MAC seeks to eliminate the consequences of indiscriminate weapons – weapons that cannot differentiate between civilians and combatants, and legitimate or illegitimate targets. This is the work that first sparked my interest and passion in humanitarian disarmament. After first hoping to become a MAC intern all those years ago, I am thrilled to now be working out of the Mines Action Canada office.
MAC: What are you most looking forward to in your new job?
IJ: I am most looking forward to the variety of the work. There is something very exciting about working in an environment where every day is a little different and there are always new challenges and opportunities to learn landing on your desk – which I think is part of the nature of working on a campaign that is small on staff, big on goals!
MAC: What do you like doing in your spare time?
IJ: In my spare time I love getting outdoors − camping, hiking, canoeing, scuba diving – and exploring new places through travel. Next on my travel bucket list is hiking in the Patagonia region of Chile. I am also an avid reader and you can often find me curled up on the couch with a new book, or re-reading one of my favourites for the umpteenth time.