About the issue

Killer robots have been a staple trope in fiction and entertainment for years, but over the past decade the possibility of fully autonomous weapons has moved closer to reality. The recent dramatic rise in unmanned weapons has already changed the face of warfare, and new technology is permitting serious efforts to develop fully autonomous weapons. These robotic weapons would be able to select and fire on targets on their own, without any human intervention. That capability would pose a fundamental challenge to the protection of civilians and to compliance with international human rights and humanitarian law. For clarity, it is necessary to note that fully autonomous weapons are not drones; drones have a human pilot in a remote location. Fully autonomous weapons are a large step beyond armed drones.

Although there is some debate about how soon fully autonomous weapons could be available, we do know that some high-tech militaries, including China, Israel, Russia, the United Kingdom, and the United States, are moving toward technology that would give more autonomy to machines in combat. If one state deploys fully autonomous weapons, others may feel compelled to develop their own, leading to a robotic arms race. Action is needed now to establish controls on these weapons before investments, technological momentum, and new military doctrine make it difficult to change course.

Allowing machines to make life-or-death decisions crosses a fundamental moral line. Autonomous robots would lack human judgment and the ability to understand context, qualities that are necessary to make complex ethical choices on a dynamic battlefield, to distinguish adequately between soldiers and civilians, and to evaluate the proportionality of an attack. As a result, fully autonomous weapons would not meet the requirements of the laws of war.

Replacing human troops with machines could make the decision to go to war easier, shifting the burden of armed conflict further onto civilians. The use of fully autonomous weapons would also create an accountability gap, since it is unclear who would be legally responsible for a robot's actions: the commander, the programmer, the manufacturer, or the robot itself? Without accountability, these parties would have less incentive to ensure robots did not endanger civilians, and victims would be denied the satisfaction of seeing someone held responsible for the harm they suffered.
