Autonomous weapons. The phrase sounds threatening, but those two words can’t possibly encapsulate the grim reality they would bring: unmanned, armed machines that search for and eliminate people based on a set of preprogrammed criteria. No human would even be needed to pull the trigger.
The prospect of this cold, clinical future has spurred more than 1,000 high-profile artificial intelligence (AI) experts and leading researchers to sign an open letter calling for a ban on “offensive autonomous weapons.” The letter, which is open for public signature, was presented at the International Joint Conference on Artificial Intelligence in Buenos Aires.
The signatories include Tesla’s Elon Musk, Professor Stephen Hawking, Google DeepMind chief executive Demis Hassabis, and Apple co-founder Steve Wozniak. With advanced AI on the cusp of becoming reality, it’s imperative that it be used for the benefit of humanity rather than for destruction. Otherwise, mankind could find that its creations are beyond its control.
Wired also spotted, among the signatures, a familiar name: Sarah Connor, of Terminator fame. If anyone was going to try to stop the autonomous weapons from rising up, then you can guarantee that her name would feature somewhere.
The letter states that “autonomous weapons have been described as the third revolution in warfare, after gunpowder and nuclear arms,” suggesting their development could prompt an arms race similar to the Cold War.
“The endpoint of this technological trajectory is obvious: autonomous weapons will become the Kalashnikovs of tomorrow,” the letter said. “The key question for humanity today is whether to start a global AI arms race or to prevent it from starting.”
The reality of autonomous weapons could be terrifying: armed robots deciding, on the basis of their programming alone, whether a living being lives or dies. The cost of a mistake is measured in lives, and it’s not clear who would be held accountable for the machines’ actions.
While autonomous weapons could make warzones safer for soldiers and military personnel, their potential to devastate human life has enough experts worried to warrant significant discussion on the topic.