Ban the killer robots before it's too late

By Noel Sharkey, Special to CNN
April 3, 2013 -- Updated 1509 GMT (2309 HKT)
A U.S. Predator unmanned drone armed with a missile on the tarmac at Kandahar military airport in June 2010.
STORY HIGHLIGHTS
  • UK robotics professor leading calls for a worldwide ban on autonomous weapons
  • We can't rely on robots to conform to international law, says Noel Sharkey
  • Sharkey is chairman of an NGO leading a campaign to "Stop Killer Robots"
  • Autonomous robots could destabilize world security and trigger unintentional wars

Editor's note: Noel Sharkey is Professor of Artificial Intelligence and Robotics and Professor of Public Engagement at the UK's University of Sheffield and Chairman of the International Committee for Robot Arms Control.

(CNN) -- As wars become increasingly automated, we must ask ourselves how far we want to delegate responsibility to machines. Where do we want to draw the line?

Weapons systems have been evolving for millennia and there have always been attempts to resist them. But does that mean that we should just sit back and accept our fate and hand over the ultimate responsibility for killing to machines?

Over the last few months there has been an increasing debate about the use of fully autonomous robot weapons: armed robots that once launched can select their own targets and kill them without further human intervention.

Some have argued that robots could be more accurate on the battlefield than human soldiers and save more civilian lives. But this is speculation based on assumptions about future developments of computer hardware and software. It is no more than "hopeware" -- since the 1950s, Artificial Intelligence has moved at a snail's pace compared to what proponents have predicted.

Others argue that even if robots could be more accurate under some restricted circumstances at some unknown time in the future, we just shouldn't grant machines the decision about who lives or dies.

At this point, we cannot rely on machines having the independent capacity to conform to international law. Current sensing systems are not up to the task. And even if machines had adequate sensing mechanisms, they would still lack the vital components of battlefield awareness and common-sense reasoning needed to decide whom it is appropriate to kill, and when.

Robots do not have the agency to decide if striking a target is proportional to the expected military advantage. There is no metric for this. Much of war is art and not science. A military commander must make a qualitative decision about the number of civilian lives that can be risked for a particular military objective. And that commander can be held accountable.

A robot doesn't have the moral agency to be held accountable. Some would argue that the commander who sends a robot on a mission would be responsible (last point of contact). But that could be unfair since it could be the fault of the mission programmer, the manufacturer or one of dozens of little companies providing components. Maybe it should be the senior staff or policy makers who had the idea to use robots. Or the device could have been tampered with in the industrial supply chain or even damaged in action. Forensics are extremely difficult with such complex devices.

Yet a recent U.S. DoD directive (November 2012) gives a green light to research and development of autonomous weapons systems while presenting a cautious route to their deployment.

This is the culmination of U.S. military road maps dating back to 2002, and it is a bad move. It sends the wrong message to other nations. As the most militarily advanced nation on the planet, the U.S. has the opportunity to take the lead in halting these developments.

Thanks to the U.S.'s use of drones, more than 70 other countries have acquired the technology in a new arms race. It is simply blinkered to think that they will not follow suit with autonomous weapons. Is anyone thinking about how an adaptive enemy will exploit the weaknesses of robot weapons with spoofing, hacking or misdirection?

Is anyone considering how unknown computer programs will interact when swarms of robots meet? Is anyone considering how autonomous weapons could destabilize world security and trigger unintentional wars?

In April this year in London, a group of prominent NGOs will launch a large civil society campaign to "Stop Killer Robots." They are seeking a new legally binding preemptive international treaty to prohibit the development and deployment of fully autonomous robot weapons.

The aim is to stop these weapons getting into the arsenals of the world's militaries while there is still an opportunity. Once there have been large national investments in the technology, it may be too late.

Do you think autonomous robot weapons should be outlawed? Leave your comments below.

The opinions expressed in this commentary are solely those of Noel Sharkey.
