Monthly Archives: April 2013
Meet David Wreckham – Robot Campaigner
David Wreckham is a friendly robot campaigning for a ban on killer robots. See him in action during the launch of the Campaign to Stop Killer Robots in London last week. You can follow David Wreckham on Twitter.
(c) Sharron Ward for the campaign, 23 April 2013.
Learning from the past, protecting the future
A key lesson learned from the Canadian-led initiative to ban landmines is not to wait until there is a global crisis before taking action. Fifteen years after the Ottawa Treaty banning landmines was opened for signature, there has been remarkable success. However, because the weapon was used so widely before the ban treaty became international law, it has taken considerable effort and resources to reduce an international crisis to a national problem. Much work remains, but all the trend lines are positive. With continued political will and sustained funding, this is a crisis that can be solved.
That lesson of taking action before a global crisis exists was an important factor in the Norwegian-led initiative to ban cluster munitions. Although a much more high-tech weapon than landmines, cluster munitions have caused unacceptable humanitarian harm whenever they have been used. Their indiscriminate effects and their impact on innocent civilians resulted in cluster munitions being banned. Fortunately, cluster bombs have not been as widely used as landmines, so the 2008 Convention on Cluster Munitions (CCM) is very much a preventive treaty. With tens of millions of cluster submunitions, also known as bomblets, already destroyed from the stockpiles of states parties, the preventive nature of the CCM is saving countless lives, limbs and livelihoods. However, as with landmines, the use of cluster munitions before the treaty came into force means much work remains to clear the existing contamination and to help victims rebuild their shattered lives.
Both landmines and cluster munitions were considered advanced weapons in their day. Landmines were sometimes referred to as the ‘perfect soldier’, but once planted they could not tell the difference between a child and a combatant. Cluster munitions were a much more expensive and sophisticated weapon than landmines, yet once dropped or launched, the submunitions dispersed from the carrier munition could not distinguish between a soldier and a civilian. Cluster submunitions also had high failure rates and often did not explode on impact as designed, leaving behind de facto minefields.
Both landmines and cluster munitions also shared the characteristic of not knowing when a conflict had ended, so they continued to kill and injure long after peace had been established. In many cases they carried on their destructive work decades after hostilities had ceased.
Another characteristic they shared is that once humans were no longer involved, that is, once the weapons had been planted or fired, their impact became immediately problematic. With no human control over who would be targeted or when an explosion would occur, these weapons were indiscriminate by nature, which was a key factor in the movements to ban them.
Today in London, England, a new campaign will be launched that takes the concept of prevention to its full extent by seeking to ban a weapon that is not yet in use. Fully autonomous weapons are very much on the drawing boards and in the plans of technologically advanced militaries such as China, Russia, the UK and the US. These weapons pose a wide range of ethical, moral, and legal issues. The Campaign to Stop Killer Robots seeks to raise awareness of those issues and to encourage a pre-emptive ban on the weapons.
Over the past decade, the expanded use of unmanned armed vehicles or drones has dramatically changed warfare, bringing new humanitarian and legal challenges. Now rapid advances in technology are permitting the United States and other nations with high-tech militaries, including China, Israel, Russia, and the United Kingdom, to move toward systems that would give full combat autonomy to machines.
Lethal robot weapons that would be able to select and attack targets without any human intervention would take warfare to dangerous and unacceptable levels. The new campaign launched today is a coordinated international coalition of non-governmental organizations concerned with the implications of fully autonomous weapons, also called “killer robots.”
The Campaign to Stop Killer Robots calls for a pre-emptive and comprehensive ban on the development, production, and use of fully autonomous weapons. The prohibition should be achieved through an international treaty, as well as through national laws and other measures.
The term fully autonomous weapons may sound like something from a video game, but they are not. They are lethal weapons and, once programmed, will not be controlled by anyone. While some may find the idea of machines fighting machines, with humans spared the death and destruction of combat, appealing, the fact is that this will not be the case. We are not talking here about futuristic cyborgs battling each other to death, but about robots designed to kill humans. Thus the name killer robots is simultaneously deadly accurate and highly disturbing.
We live in a world where technology is omnipresent, but we are also well aware of its limitations. While we enjoy the benefits of technology and appreciate those who create and operate it, we are also well aware that airplanes sometimes crash, trains derail, ships run aground, cars get recalled, the internet occasionally blacks out (as do power grids), computers freeze, viruses spread via email messages or websites, and people occasionally end up in the wrong place because of a malfunctioning or poorly programmed GPS device. To use the vernacular, “shit happens”, or in this case, hi-tech shit happens. What could possibly go wrong with arming robots that operate without any meaningful human control?
It would also be comforting to think that since these are very advanced weapons, only the “good guys” would have them. However, events of the last two years in Libya, North Korea and Syria, to name a few, indicate that desperate dictators and rogue states have no problem acquiring the most sophisticated and hi-tech weaponry. If they can get them, so can terrorists and criminals.
Scientists and engineers have created some amazing robots which have the potential to greatly improve our lives, but no scientist or engineer should be involved in creating an armed robot that can operate without human control. Computer scientists and engineers have created fabulous devices which have increased our productivity and made life much more enjoyable for millions of people. Those computer experts should never create programs that would allow an armed machine to operate without any human in control.
The hundreds of thousands of landmine and cluster munition victims around the world are testament to the fact that what looks good on the drawing board or in the lab can have deadly consequences for innocent civilians, despite the best intentions or even the best technology money can buy. We need to learn the key lesson of these two weapons: tragedies can and should be prevented. The time to stop fully autonomous weapons does not begin next week, or next month, or during testing, or after their first use. The time to stop killer robots begins today, April 23, 2013, in London, England, and wherever you are reading this.
– Paul Hannon
Press Release – Urgent Action Needed to Ban Fully Autonomous Weapons
Non-governmental organizations convene to launch Campaign to Stop Killer Robots
(London, April 23, 2013) – Urgent action is needed to pre-emptively ban lethal robot weapons that would be able to select and attack targets without any human intervention, said a new campaign launched in London today. The Campaign to Stop Killer Robots is a coordinated international coalition of non-governmental organizations concerned with the implications of fully autonomous weapons, also called “killer robots.”
The Campaign to Stop Killer Robots calls for a pre-emptive and comprehensive ban on the development, production, and use of fully autonomous weapons. The prohibition should be achieved through an international treaty, as well as through national laws and other measures.
“Allowing life or death decisions on the battlefield to be made by machines crosses a fundamental moral line and represents an unacceptable application of technology,” said Nobel Peace Laureate Jody Williams of the Nobel Women’s Initiative. “Human control of autonomous weapons is essential to protect humanity from a new method of warfare that should never be allowed to come into existence.”
Over the past decade, the expanded use of unmanned armed vehicles or drones has dramatically changed warfare, bringing new humanitarian and legal challenges. Now rapid advances in technology are permitting the United States and other nations with high-tech militaries, including China, Israel, Russia, and the United Kingdom, to move toward systems that would give full combat autonomy to machines.
“Killer robots are not self-willed ‘Terminator’-style robots, but computer-directed weapons systems that once launched can identify targets and attack them without further human involvement,” said roboticist Noel Sharkey, chair of the International Committee for Robot Arms Control. “Using such weapons against an adaptive enemy in unanticipated circumstances and in an unstructured environment would be a grave military error. Computer controlled devices can be hacked, jammed, spoofed, or can be simply fooled and misdirected by humans.”
The Campaign to Stop Killer Robots seeks to provide a coordinated civil society response to the multiple challenges that fully autonomous weapons pose to humanity. It is concerned about weapons that operate on their own without human supervision. The campaign seeks to prohibit taking a human out-of-the-loop with respect to targeting and attack decisions on the battlefield.
“The capability of fully autonomous weapons to choose and fire on targets on their own poses a fundamental challenge to the protection of civilians and to compliance with international law,” said Steve Goose, Arms Division director at Human Rights Watch. “Nations concerned with keeping a human in the decision-making loop should acknowledge that international rules on fully autonomous weapons systems are urgently needed and work to achieve them.”
The UN Special Rapporteur on extrajudicial, summary or arbitrary executions for the Office of the High Commissioner for Human Rights, Professor Christof Heyns, is due to deliver his report on lethal autonomous robotics to the second session of the Human Rights Council in Geneva, starting May 27, 2013. The report is expected to contain recommendations for government action on fully autonomous weapons.
“One key lesson learned from the Canadian-led initiative to ban landmines was that we should not wait until there is a global crisis before taking action,” said Paul Hannon, Executive Director of Mines Action Canada. “The time to act on killer robots is now.”
“We cannot afford to sleepwalk into an acceptance of these weapons. New military technologies tend to be put in action before the wider society can assess the implications, but public debate on such a change to warfare is crucial,” said Thomas Nash, Director of Article 36. “A pre-emptive ban on lethal autonomous robots is both necessary and achievable, but only if action is taken now.”
The Campaign to Stop Killer Robots believes that humans should not delegate the responsibility of making lethal decisions to machines. It has multiple moral, legal, technical, and policy concerns with the prospect of fully autonomous weapons, including:
- Autonomous robots would lack human judgment and the ability to understand context. These human qualities are necessary to make complex legal choices on a dynamic battlefield, to distinguish adequately between soldiers and civilians, and to evaluate the proportionality of an attack. As a result, fully autonomous weapons would not meet the requirements of the laws of war.
- The use of fully autonomous weapons would create an accountability gap, as there is no clarity on who would be legally responsible for a robot’s actions: the commander, the programmer, or one of the manufacturers of the many sensing, computing, and mechanical components? Without accountability, these parties would have less incentive to ensure that robots did not endanger civilians, and victims would be denied the satisfaction of seeing someone punished for the wrongful harm they suffered.
- If fully autonomous weapons are deployed, other nations may feel compelled to abandon policies of restraint, leading to a destabilizing robotic arms race. Agreement is needed now to establish controls on these weapons before investments, technological momentum, and new military doctrine make it difficult to change course.
- The proliferation of fully autonomous weapons could make resort to war and armed attacks more likely by reducing the possibility of military casualties.
The Campaign to Stop Killer Robots includes several non-governmental organizations (NGOs) associated with the successful efforts to ban landmines, cluster munitions, and blinding lasers. Its members collectively have a wide range of expertise in robotics and science, aid and development, human rights, humanitarian disarmament, international law and diplomacy, and the empowerment of women, children, and persons with disabilities. The campaign is building a worldwide network of civil society contacts in countries including Canada, Egypt, Japan, The Netherlands, New Zealand, Pakistan, United Kingdom, and the United States.
The Steering Committee is the principal leadership and decision-making body of the Campaign to Stop Killer Robots and is composed of nine NGOs: five international NGOs (Human Rights Watch, International Committee for Robot Arms Control, Nobel Women’s Initiative, Pugwash Conferences on Science & World Affairs, and Women’s International League for Peace and Freedom) and four national NGOs (Article 36 (UK), Association for Aid and Relief Japan, Mines Action Canada, and IKV Pax Christi (The Netherlands)).
The Campaign to Stop Killer Robots was established by representatives of seven of these NGOs at a meeting in New York on 19 October 2012. It is an inclusive and diverse coalition open to NGOs, community groups, and professional associations that support the campaign’s call for a ban and are willing to undertake actions and activities in support of the campaign’s objectives. The campaign’s initial coordinator is Mary Wareham of Human Rights Watch.
On Monday, April 22, the Steering Committee of the Campaign to Stop Killer Robots convened a day-long conference for 60 representatives from 33 NGOs from ten countries to discuss the potential harm that fully autonomous weapons could pose to civilians and to strategize on actions that could be taken at the national, regional, and international levels to ban the weapons.
Contact information for the Campaign to Stop Killer Robots:
- Website – www.stopkillerrobots.org
- Facebook – http://www.facebook.com/#!/stopkillerrobots
- Twitter – @BanKillerRobots
- Flickr – http://www.flickr.com/people/stopkillerrobots
- YouTube – http://www.youtube.com/user/StopKillerRobots
To schedule a media interview (see list of spokespersons), please contact:
- UK media – Laura Boillot at Article 36, +44(0)7515-575-175, [email protected]
- International media – Kate Castenson at Human Rights Watch, +1 (646) 203-8292, [email protected]
Video Footage
- Raw interview footage of Williams, Sharkey, Goose, and Docherty: http://multimedia.hrw.org/distribute/hpgicavqly
- Playlist of precursors to fully autonomous weapons: /YQe4w8
For more information, see:
- Human Rights Watch “Losing Humanity” report on fully autonomous weapons: /UQscFA
- Human Rights Watch “Review of the New US Policy on Autonomy in Weapons Systems” briefing paper: /17FDTTj
List of Spokespersons
The following campaign spokespersons will be speaking at the launch events in London on 22-24 April and are available for interview on request. In addition, raw interview footage of Williams, Sharkey, Goose, and Docherty is available here: http://multimedia.hrw.org/distribute/hpgicavqly
Principal Spokespersons
Ms. Jody Williams – Nobel Women’s Initiative, @JodyWilliams97 @NobelWomen
Jody Williams received the Nobel Peace Prize in 1997 for her work to ban landmines through the International Campaign to Ban Landmines, which shared the Peace Prize. In January 2006, she established the Nobel Women’s Initiative together with five of her sister Nobel Peace laureates. In an April 2011 article for the International Journal of Intelligence Ethics, Williams called for a ban on “fully autonomous attack and kill robotic weapons.” In March 2013, the University of California Press published her memoir My Name is Jody Williams: A Vermont Girl’s Winding Path to the Nobel Peace Prize. Williams can speak on why civil society is coming together and partnering with other actors to pursue a pre-emptive ban on fully autonomous weapons. Longer biography available here: /JKVvBd
Prof. Noel Sharkey – International Committee for Robot Arms Control, @StopTheRobotWar
Roboticist Noel Sharkey is Professor of Artificial Intelligence and Robotics and Professor of Public Engagement at the University of Sheffield. He is co-founder and chair of the International Committee for Robot Arms Control (ICRAC), a group of experts concerned with the pressing dangers that military robots pose to peace and international security. Sharkey can speak on the technology that the campaign is seeking to prohibit and its ethical implications. See also: /9fJQ7j
Mr. Steve Goose – Human Rights Watch, @hrw
Steve Goose is executive director of the Arms Division of Human Rights Watch and chair of the International Campaign to Ban Landmines and Cluster Munition Coalition (ICBL-CMC). Goose and Human Rights Watch were instrumental in bringing about the 2008 Convention on Cluster Munitions, the 1997 international treaty banning antipersonnel mines, the 1995 protocol banning blinding lasers, and the 2003 protocol on explosive remnants of war. Goose can speak on why a ban on fully autonomous weapons is necessary and achievable, and explain current US policy and practice. See also: /USEBZo
Mr. Thomas Nash – Article 36, @nashthomas @article36
Thomas Nash is director of Article 36 and joint coordinator of the International Network on Explosive Weapons. As Coordinator of the Cluster Munition Coalition from 2004 to 2011, Nash led the global civil society efforts to secure the Convention on Cluster Munitions. Nash can speak about civil society expectations of UK policy, practice, and diplomacy on fully autonomous weapons.
Ms. Mary Wareham – Human Rights Watch, @marywareham, @hrw
Mary Wareham is advocacy director of the Arms Division of Human Rights Watch and initial coordinator of the Campaign to Stop Killer Robots. She worked on the processes that created the Convention on Cluster Munitions and the Mine Ban Treaty, and has worked to ensure their universalization and implementation. Wareham can speak about the new Campaign to Stop Killer Robots and its initial plans.
Technical Experts
Dr. Jürgen Altmann – International Committee for Robot Arms Control
Jürgen Altmann is co-founder and vice-chair of the International Committee for Robot Arms Control. He is a physicist and peace researcher at Dortmund Technical University in Germany. Altmann has studied preventive arms control of new military technologies and new methods for the verification of disarmament agreements. He can speak about Germany’s policy and practice on fully autonomous weapons.
Dr. Peter Asaro – International Committee for Robot Arms Control, @peterasaro
Peter Asaro is co-founder and vice-chair of the International Committee for Robot Arms Control. He is a philosopher of technology who has worked in Artificial Intelligence, neural networks, natural language processing and robot vision research. Asaro is director of Graduate Programs for the School of Media Studies at The New School for Public Engagement in New York City. See also: /73JqBw
Ms. Bonnie Docherty – Human Rights Watch, @hrw
Bonnie Docherty is senior researcher in the Arms Division at Human Rights Watch and also a lecturer on law and senior clinical instructor at the International Human Rights Clinic at Harvard Law School. She has played an active role, as both lawyer and field researcher, in the campaign against cluster munitions. Docherty’s report Losing Humanity: The Case against Killer Robots outlines how fully autonomous weapons could violate the laws of war and undermine fundamental protections for civilians. See also: /103PV4t
Mr. Richard Moyes – Article 36, @rjmoyes @article36
Richard Moyes is a managing partner at Article 36 and an honorary fellow at the University of Exeter. He was previously director of policy at Action on Armed Violence (formerly Landmine Action) and served as co-chair of the Cluster Munition Coalition. Moyes can speak about civil society expectations of UK policy, practice, and diplomacy on fully autonomous weapons. See also: /103SAuS
Steering Committee members
Human Rights Watch, www.hrw.org
Human Rights Watch is serving as initial coordinator of the Campaign to Stop Killer Robots. Over the past two decades, the Arms Division of Human Rights Watch has been instrumental in enhancing protections for civilians affected by conflict, leading the International Campaign to Ban Landmines, which resulted in the 1997 Mine Ban Treaty, and the Cluster Munition Coalition, which spurred the 2008 Convention on Cluster Munitions. It also led the effort that resulted in the pre-emptive prohibition on blinding laser weapons in 1995. In November 2012, Human Rights Watch and Harvard Law School’s International Human Rights Clinic launched the report Losing Humanity: The Case against Killer Robots, the first in-depth report by a non-governmental organization on the challenges posed by fully autonomous weapons.
Article 36 (UK), www.article36.org
Article 36 is a UK-based not-for-profit organization working to prevent the unintended, unnecessary or unacceptable harm caused by certain weapons. It undertakes research, policy and advocacy and promotes civil society partnerships to respond to harm caused by existing weapons and to build a stronger framework to prevent harm as weapons are used or developed in the future. In March 2012, Article 36 called for a ban on military systems that are able to select and attack targets autonomously.
Association for Aid and Relief Japan, www.aarjapan.gr.jp
Association for Aid and Relief, Japan is an international non-governmental organization founded in Japan in 1979. As a committed member of the International Campaign to Ban Landmines, Association for Aid and Relief, Japan played a central role in convincing Japan to ban antipersonnel landmines and join the 1997 Mine Ban Treaty.
IKV Pax Christi (The Netherlands), www.ikvpaxchristi.nl
IKV Pax Christi is a peace organization based in the Netherlands. It works with local partners in conflict areas and seeks political solutions to crises and armed conflicts. In May 2011, IKV Pax Christi published a report entitled Does Unmanned Make Unacceptable? Exploring the Debate on using Drones and Robots in Warfare.
International Committee for Robot Arms Control, http://icrac.net
The International Committee for Robot Arms Control (ICRAC) is a not-for-profit organization comprised of scientists, ethicists, lawyers, roboticists, and other experts. It works to address the potential dangers involved with the development of armed military robots and autonomous weapons. Given the rapid pace of development of military robots and the pressing dangers their use poses to peace, international security, the rule of law, and to civilians, ICRAC supports a ban on armed robots with autonomous targeting capability.
Mines Action Canada, www.minesactioncanada.org
Mines Action Canada is a coalition of over 35 Canadian non-governmental organizations working in mine action, peace, development, labour, health and human rights that came together in 1994. It is the Canadian partner of the International Campaign to Ban Landmines and a founding member of the Cluster Munition Coalition.
Nobel Women’s Initiative, nobelwomensinitiative.org
The Nobel Women’s Initiative was established in January 2006 by 1997 Nobel Peace Laureate Jody Williams and five of her sister Nobel Peace laureates. The Nobel Women’s Initiative uses the prestige of the Nobel Peace Prize and of courageous women peace laureates to magnify the power and visibility of women working in countries around the world for peace, justice and equality. In an April 2011 article for the International Journal of Intelligence Ethics, Williams called for a ban on “fully autonomous attack and kill robotic weapons.”
Pugwash Conferences on Science & World Affairs, www.pugwash.org
A central objective of Pugwash is the elimination of all weapons of mass destruction (nuclear, chemical and biological) and of war as a social institution for settling international disputes. To that end, the peaceful resolution of conflicts through dialogue and mutual understanding is an essential part of Pugwash activities, which is particularly relevant when and where nuclear weapons and other weapons of mass destruction are deployed or could be used.
Women’s International League for Peace and Freedom, www.wilpf.org
The Women’s International League for Peace and Freedom (WILPF) is the oldest women’s peace organization in the world. Its aims and principles include working toward world peace; total and universal disarmament; the abolition of violence and coercion in the settlement of conflict and their substitution in every case by negotiation and conciliation; the strengthening of the United Nations system; the continuous development and implementation of international law; political and social equality and economic equity; co-operation among all people; and environmentally sustainable development.
# # #
Media Advisory
Press Briefing to mark the campaign’s launch
When
Tuesday, April 23, 2013
10:30 a.m. to 11:30 a.m.
Where
The Frontline Club, 13 Norfolk Place, London W2 1QJ
(Closest tube stations: Paddington, Edgware Road and Lancaster Gate)
Press Briefing
At the press briefing, the campaign’s founders will outline their concerns with fully autonomous weapons, also known as “killer robots.” The new global coalition seeks a ban on these weapons that would be able to select targets and use lethal force without human intervention. The campaign’s call to action, composition, and initial activities will be explained and all your questions answered…
Speakers
- Ms. Jody Williams, 1997 Nobel Peace Laureate and Chair of the Nobel Women’s Initiative
- Prof. Noel Sharkey, Chair of the International Committee on Robot Arms Control
- Steve Goose, Director of the Arms Division of Human Rights Watch
Please see the biographies of these and other campaign spokespersons.
Contact
To confirm your participation or to schedule an interview, please contact:
- UK media – Laura Boillot, Article 36, +44(0)7515-575-175, [email protected]
- International media – Kate Castenson, Human Rights Watch, +1 (202) 612-4351 or +1-646-203-8292 (mobile), [email protected]
- Canadian media – Erin Hunt, +1 (613) 241-3777, [email protected] or in London, Paul Hannon, +1 (613) 951-5430 (mobile) or [email protected]
Video Footage
- Raw interview footage of Williams, Sharkey, Goose, and Docherty: http://multimedia.hrw.org/distribute/hpgicavqly
- Playlist of precursors to fully autonomous weapons: /YQe4w8
Social media information
- Facebook – http://www.facebook.com/#!/stopkillerrobots
- Twitter – @BanKillerRobots
- Flickr – http://www.flickr.com/people/stopkillerrobots
- YouTube – http://www.youtube.com/user/StopKillerRobots
For more information
- Human Rights Watch, “Losing Humanity” report: /UQscFA
Campaign Launch April 23
Our campaign launch will be held in London, England, on 22-23 April 2013, including a press conference on Tuesday, 23 April.
Check back then for more details about this exciting new campaign!