February 27, 2008

Skynet update

Sarah Connor and her son would agree with this fellow.

Increasingly autonomous, gun-toting robots developed for warfare could easily fall into the hands of terrorists and may one day unleash a robot arms race, a top expert on artificial intelligence told AFP.

"They pose a threat to humanity," said University of Sheffield professor Noel Sharkey ahead of a keynote address Wednesday before Britain's Royal United Services Institute.

Intelligent machines deployed on battlefields around the world -- from mobile grenade launchers to rocket-firing drones -- can already identify and lock onto targets without human help.

But up to now, a human hand has always been required to push the button or pull the trigger.

If we are not careful, [Sharkey] said, that could change.

Military leaders "are quite clear that they want autonomous robots as soon as possible, because they are more cost-effective and give a risk-free war," he said.

Washington plans to spend four billion dollars by 2010 on unmanned technology systems, with total spending expected to rise to 24 billion, according to the Department of Defense's Unmanned Systems Roadmap 2007-2032, released in December.

James Canton, an expert on technology innovation and CEO of the Institute for Global Futures, predicts the deployment within a decade of detachments that will include 150 soldiers and 2,000 robots.

The use of such devices by terrorists should be a serious concern, said Sharkey.

Captured robots would not be difficult to reverse engineer, and could easily replace suicide bombers as the weapon-of-choice. "I don't know why that has not happened already," he said.

But even more worrisome, he continued, is the subtle progression from the semi-autonomous military robots deployed today to fully independent killing machines.

"I have worked in artificial intelligence for decades, and the idea of a robot making decisions about human termination terrifies me," Sharkey said.

Ronald Arkin of Georgia Institute of Technology, who has worked closely with the US military on robotics, agrees that the shift towards autonomy will be gradual.

But he is not convinced that robots have no place on the front line.

"Robotics systems may have the potential to out-perform humans from a perspective of the laws of war and the rules of engagement," he told a conference on technology in warfare at Stanford University last month.

The sensors of intelligent machines, he argued, may ultimately be better equipped to understand an environment and to process information. "And there are no emotions that can cloud judgement, such as anger," he added.

Nor is there any inherent right to self-defence.

For now, however, there remain several barriers to the creation and deployment of Terminator-like killing machines.

But even if technical barriers are overcome, the prospect of armies increasingly dependent on remotely-controlled or autonomous robots raises a host of ethical issues that have barely been addressed.

You think?

Fox has been running a new sci-fi series on Monday nights called The Sarah Connor Chronicles that touches on the potential pitfalls of fielding autonomous killing machines.

But that's just a silly TV show, right?

Isn't it?

Posted by Mike Lief at February 27, 2008 06:27 AM
