Guest Post: The Importance of a Ban on Killer Robots for an International Affairs Student

by Brett MacFarlane

When I first applied for an internship position to work on the Campaign to Stop Killer Robots back in November, I knew virtually nothing about either the campaign or the killer robots issue. I chose the internship with Mines Action Canada as my top choice because it was the position most closely related to my field of study: Conflict Analysis and Conflict Resolution. When submitting my application, I had a conversation with my fellow students about what exactly killer robots were. The general consensus of the group was that killer robots had to be the drones being used militarily in countries such as Pakistan and Yemen.

Since joining the International Campaign to Stop Killer Robots in January, I have had the privilege of being exposed to a new issue that has not been widely discussed by the general public, or even by most international affairs students. I learned about current efforts by militaries to develop robotic weapons with complete autonomy to decide whether or not to fire on a specified target, without meaningful human control. Most disturbingly, I learned that some countries, including the United States and Israel, have not only taken steps to develop "human-out-of-the-loop weapons," but that some current technologies could easily be adapted to become autonomous weapons. As a student in an international affairs program and as a concerned person, I believe this issue raises serious human rights and humanitarian concerns.

The use of autonomous weapons is a troubling issue for human rights advocates and humanitarian organizations because it would make humans increasingly vulnerable in warfare, where international law is not designed to accommodate autonomous weapons. First, how could the protection of civilians be guaranteed in times of combat? If human judgment is taken off the battlefield, robots would be tasked with distinguishing armed combatants from ordinary citizens. In this scenario, could a robot differentiate between a soldier holding a weapon and a child holding a toy gun? Such mistakes become more likely as robots are given greater autonomy and decision-making power on the battlefield. Further, the development and use of autonomous weapons could pose serious problems of accountability in war. For example, if a robotic system were to go awry and massacre a village of non-combatants, who would be held accountable? Would it be the system's operator, the military, the computer programmer, or the manufacturer of the machine? Without military troops in the air, on land, or at sea, who can be held liable for the actions of robots in combat? Implementing autonomous robots in war would severely reduce the legal protections civilians are accorded during conflict.

I am very concerned that putting autonomous weapons on the battlefield would change how wars are fought and conducted. Wars would no longer be fought by the military personnel of two opposing sides, but by autonomous weapons, capable of making their own 'kill decisions', against human forces. Countries with the financial means to develop autonomous weapons could threaten less developed countries, which would bear the cost of higher human casualties on the battlefield. More importantly, the potential for future conflict would grow, as the decision to enter combat would become much easier for leaders who no longer have to bear the cost of human casualties. The concern here is that countries would be sending machines to fight against humans, instead of the traditional model of human versus human. As difficult as this may be to hear, it is only through the casualties of soldiers on the battlefield that we see the true cost of warfare. Taking human sacrifice out of the battlefield could lead to an increase in future warfare.

As interest in the topic of killer robots grows in the international community, it is pressing that students, and indeed all citizens, begin to discuss the development of autonomous robots for military use in their respective fields. Should silence continue not only in the academic community but also in the Canadian Parliament and the public domain, the prospect of autonomous robots making life-and-death decisions on the battlefield without human control may be realized. As one concerned student and citizen who has signed the petition to Keep Killer Robots Fiction, I strongly encourage everyone not only to learn more about the subject, but to join me in signing the petition at /KRpetition. Only through increased discussion and knowledge of this topic in the general community can pressure be mounted on governments to enact a pre-emptive ban on this emerging threat.

Brett MacFarlane interned at Mines Action Canada and is a Master of Arts candidate at the Norman Paterson School of International Affairs at Carleton University, specializing in Conflict Analysis and Conflict Resolution.

Posted on April 23, 2014, in Campaign.
