BBC NEWS. Killer robots: Tech experts warn against AI arms race. Published 29 July 2015.
More than 1,000 tech experts, scientists and researchers have written a letter warning about the dangers of autonomous weapons.
In the latest outcry over “killer robots”, the letter warns that “a military AI [artificial intelligence] arms race is a bad idea”.
Among the signatories are scientist Stephen Hawking, entrepreneur Elon Musk and Apple co-founder Steve Wozniak.
The letter will be presented at an international AI conference today.
“Killer robots” are currently the subject of much debate and have recently been discussed by committees at the United Nations, which is considering the potential for a ban on certain types of autonomous weapons.
Now, the experts have called for a specific ban on the use of artificial intelligence to manage weapons that would be “beyond meaningful human control”.
“Just as most chemists and biologists have no interest in building chemical or biological weapons, most AI researchers have no interest in building AI weapons – and do not want others to tarnish their field by doing so,” they add.
MIT professor Noam Chomsky, Google DeepMind chief Demis Hassabis and consciousness expert Daniel Dennett are among others to have endorsed the letter.
The text, which has been published online by the Future of Life Institute (FLI), will be presented to delegates of the International Joint Conference on Artificial Intelligence in Buenos Aires.
AMA on AI
Prof Hawking, a signatory to the letter, is currently taking part in an Ask Me Anything (AMA) session on Reddit, in which he is collecting questions about “making the future of technology more human”.
He will respond to selected questions throughout the week, but has not yet posted his first reply.
In December, in an exclusive interview with the BBC, the professor raised his concern that AI could spell the end of mankind.
“Humans, who are limited by slow biological evolution, couldn’t compete [with artificial intelligence], and would be superseded,” he said.
But Eric Horvitz – a Microsoft Research chief who signed the autonomous weapons letter – has also appeared in a video posted online by his firm, in which he defends the wider field of AI research.
“You look at how much computation has done for our society, for socio-economics, in applications like healthcare – it’s been incredible. AI will change so many things,” he said.
“With that comes a lot of hope, a lot of possible benefits and also some concerns.
“Will the machines become so powerful and smart that they can’t be turned off and they come to outwit man?
“I think there are very interesting questions that need to be solved along the way, but I expect largely positive beneficial results coming out of this research largely because we guide it.”
In the loop
The UK’s Ministry of Defence said in a statement that all UK forces currently operate “in accordance with International Humanitarian Law”.
“The UK’s clearly defined Rules of Engagement are formulated on this basis.
“As such, there is always a ‘man in the loop’ controlling the system.
“UK military personnel are and will always be involved in the decision to employ and in the act of releasing weapons,” it said.
A spokesman for BAE Systems, the UK’s biggest defence contractor, added: “We are designing systems that will always be required to comply with the rules of engagement and legal and regulatory requirements. We support the UK MoD stance that there should be military personnel involved in the decision to employ weapons.”