Blog Archives
2016: A year for action
We’re almost a month into 2016 and autonomous weapons systems have already been in the news thanks to a strong panel discussion at the World Economic Forum in Davos. The Campaign to Stop Killer Robots was pleased to see the panel agree that the world needs to start a diplomatic process soon to pre-emptively ban autonomous weapons systems. You can read the whole analysis by the Campaign’s coordinator here.
Yes, 2016 is starting on a high note for the campaign, but this is not the time to be complacent. We need to keep that momentum going internationally and here in Canada. The new government has yet to share a national policy on autonomous weapons systems. Before the election, the Liberal Party of Canada wrote that:
“Emerging technologies such as Lethal Autonomous Weapon Systems pose new and serious ethical questions that must be studied and understood. The Liberal Party of Canada will work with experts and civil society to ensure that the Canadian Government develops appropriate policies to address the use and proliferation of autonomous weapon systems.”
Now that the Liberals form the government, they will have to develop “appropriate policies” soon because the international community is moving forward, albeit very slowly. States are meeting in April 2016 for a third (and hopefully final) informal experts meeting on autonomous weapons systems under the United Nations’ Convention on Conventional Weapons, and then at the end of the year, states will have the opportunity to start negotiations on a pre-emptive ban. The UN process has been called “glacial” and said to “show no sense of urgency,” but there is still time for states to pick up the pace, and Canada can take a leadership role.
Canadian industry, academics and NGOs have already taken a leadership role on banning autonomous weapons systems, so now it’s the government’s turn. The Canadian government and Prime Minister Trudeau made a big impression at the World Economic Forum, so we hope that they will carry that energy forward to act on one of the newest issues discussed there. Let’s make 2016 a year of action on autonomous weapons systems.
Video Contest – Runner Up Announced
In less than two weeks, states will decide if and how they will continue international talks on autonomous weapons systems at the UN’s Convention on Conventional Weapons in Geneva. We and the whole Campaign to Stop Killer Robots are calling on states to take the next step towards a ban by agreeing to a Group of Governmental Experts.
With such an important decision looming over states, we are releasing the winning videos from our youth video contest. This week, we are pleased to present the runner-up video (and top high school video) by Daryl, Henry, Joseph and Anders at Petersburg High School.
Please feel free to share widely!
We thank all those who submitted videos to the contest and congratulate Daryl, Henry, Joseph and Anders on their excellent video. Come back next week to see the winning entry.
New Video on Human Control
This week states are meeting at the United Nations in Geneva to decide if discussion on lethal autonomous weapons systems will continue at the Convention on Conventional Weapons. States should continue to discuss this issue and to debate key problems with autonomous weapons systems. One of the key problems is the issue of human control. Learn more with this new video.
Campaign to Stop Killer Robots Summer Recap
The Campaign to Stop Killer Robots has been trundling along all summer sharing our message, reaching out to governments and gaining new supporters.
There have been some exciting and important developments over the summer. The International Committee of the Red Cross (ICRC) launched the newest edition of the International Review of the Red Cross, and the theme is New Technologies and Warfare. A number of campaigners contributed to the journal, so it is definitely worth a read. The ICRC also published a Frequently Asked Questions document on autonomous weapons that helps explain the issue and the ICRC’s position on fully autonomous weapons.
France, along with the United Nations Office for Disarmament Affairs in Geneva, convened a seminar on fully autonomous weapons for governments and civil society in early September. The Campaign to Stop Killer Robots had campaigners taking part, and you can read the full report on the global campaign’s website.
The campaigns in Germany and Norway are starting off strong as well. In the lead-up to the German election, all the major parties shared their policy positions on fully autonomous weapons with our colleagues at Facing Finance. Norwegian campaigners launched their campaign with a breakfast seminar, and now they are waiting to hear what the new Norwegian government’s policy on fully autonomous weapons will be.
Like our colleagues in Norway, we’re still waiting to hear what Canada’s policy on fully autonomous weapons will be. We have written to the Ministers of National Defence and of Foreign Affairs, but the campaign team has not yet heard back. In the meantime, Canadians can weigh in on the topic through our new online petition. Share and sign the petition today! This petition is the first part of a new initiative that will be coming your way in a few weeks. Keep your eye out for the news, and until then keep sharing the petition so that the government knows that Canadians have concerns about fully autonomous weapons and believe that Canada should have a strong policy against them.
EDIT: We had a very human moment here and forgot to include congratulations to James Foy of Vancouver for winning the 2013 Canadian Bar Association’s National Military Law Section Law School Sword and Scale Essay Prize for his essay called Autonomous Weapons Systems: Taking the Human out of International Humanitarian Law. It is great to see law students looking at this new topic and also wonderful that the Canadian Bar Association recognized the importance of this issue. Congratulations James!
UK Parliament Debates Fully Autonomous Weapons
Last month at the United Nations Human Rights Council, we were somewhat concerned when the UK was the only state opposed to a moratorium or a ban on fully autonomous weapons. After a parliamentary debate on June 17, 2013, we have a little more clarity. In response to a speech by Nia Griffith, MP, the Minister for Counter-Proliferation, Alistair Burt, MP, agreed that fully autonomous weapons would not “be able to meet the requirements of international humanitarian law” and stressed that the UK does not have fully autonomous weapons and does not plan to acquire any.
Our colleagues at Article 36 have done a detailed analysis of the debate. In light of the stronger language in this debate, there is some room to be optimistic:
It would seem straightforward to move from such a strong national position to a formalised national moratorium and a leading role within an international process to prohibit such weapons. The government did not provide any reason as to why a moratorium would be inappropriate, other than to speculate on the level of support amongst other countries for such a course of action.
Whilst significant issues still require more detailed elaboration, Article 36 believes this parliamentary debate has been very valuable in prompting reflection and Ministerial scrutiny of UK policy on fully autonomous weapons and narrowing down the areas on which further discussions should focus. It appears clear now that there will be scope for such discussions to take place with the UK and other states in the near future.
The UK parliamentary debate and Article 36’s analysis of it, coming so soon after the Human Rights Council debate and the widespread media coverage of the issue, make it quite clear that it is time to have a similarly substantive and non-partisan debate in the Canadian House of Commons as the government works out its policy on this important issue.
Related articles
- Avoiding Rabbit Holes Through Policy and Law (stopkillerrobots.ca)
Asimov’s Three Laws of Robotics
In the weeks since the Campaign to Stop Killer Robots launched, there has been a lot of media coverage. The coverage is very exciting, and what I have found particularly interesting is the number of articles that refer to Isaac Asimov’s Three Laws of Robotics.
Now, unless, like me, you grew up with a sci-fi geek for a father who introduced you to various fictional worlds like those of Star Wars, Star Trek and 2001: A Space Odyssey at a young age, you might not know who Isaac Asimov is, what his Three Laws of Robotics are, or why these laws are relevant to the Campaign to Stop Killer Robots.
Isaac Asimov (1920-1992) was an American scientist and writer, best known for his science fiction, especially his short stories. In his writings, Asimov created the Three Laws of Robotics, which govern the actions of his robot characters. In his stories, the Three Laws were programmed into robots as a safety function. The laws were first stated in the short story Runaround, but you can see them in many of his other writings, and since then they have shown up in other authors’ work as well.
The Three Laws of Robotics are:
- A robot may not injure a human being or, through inaction, allow a human being to come to harm.
- A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
- A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
After reading the Three Laws, you can probably see why Mr. Asimov’s ideas are frequently mentioned in media coverage of our campaign to stop fully autonomous weapons. A fully autonomous weapon would most definitely violate the first and second laws of robotics.
To me, the Three Laws seem to be pretty common-sense guides for the actions of autonomous robots. It is probably a good idea to protect yourself from being killed by your own machine – ok, not probably – it is a good idea to make sure your machine does not kill you! It is also important to remember that Asimov recognized that even regular robots with artificial intelligence (not just fully autonomous weapons) could pose a threat to humanity at large, so he later added a fourth law, the Zeroth Law, to come before the others:
0. A robot may not harm humanity, or, by inaction, allow humanity to come to harm.
“But Erin,” you say, “these are just fictional stories; the Campaign to Stop Killer Robots is dealing with how things really will be. We need to focus on reality not fiction!” I hear you but since fully autonomous weapons do not yet exist we need to take what we know about robotics, warfare and law and add a little imagination to foresee some of the possible problems with fully autonomous weapons. Who better to help us consider the possibilities than science fiction writers who have been thinking about these types of issues for decades?
At the moment, Asimov’s Three Laws are the closest thing we have to laws explicitly governing the use of fully autonomous weapons. Asimov’s stories often tell tales of how the application of these laws results in robots acting in weird and dangerous ways the programmers did not predict. By articulating some pretty common-sense laws for robots and then showing how those laws can have unintended negative consequences when implemented by artificial intelligence, Asimov’s writings may have made the first argument that a set of parameters to guide the actions of fully autonomous weapons will not be sufficient. Even if you did not have a geeky childhood like I did, you can still see the problems with creating fully autonomous weapons. You don’t have to read Asimov, know who HAL is, or have a disliking for the Borg to worry that we won’t be able to control how artificial intelligence will interpret our commands. Anyone who has tried to use a computer, a printer or a cell phone knows that there is no end to the number of ways technology can go wrong. We need a pre-emptive ban on fully autonomous weapons before it is too late, and that is what the Campaign to Stop Killer Robots will be telling the diplomats at the UN in Geneva at the end of the month.
– Erin Hunt, Program Officer