Should the technology ever progress to that point, however, one thing does give McCarthy pause. “It falls under the same heading as biological and nuclear weapons. There are enough people out there with very poor judgment and a lot of anger. As technology gets easier to access — as the tools for making things become cheaper and more plentiful — we may find that crazy people have a larger capability to write their craziness onto the world. And that worries me.”
Beware of BigDog
Sometimes even our own weapons can scare the hell out of us. Boston Dynamics, the Waltham-based robotics and engineering firm, released a video clip two months ago showing the latest work on its BigDog project, a walking quadruped with stereo vision that’s been in development since 2005, funded by the military’s Defense Advanced Research Projects Agency (DARPA).
It is, in a word, fucking terrifying.
The Defense Department devised the thing as a pack mule of sorts, meant to accompany light-traveling soldiers on wheel-unfriendly terrain. But log on to YouTube and watch as its spindly but powerful limbs carry its bulk fleet-footedly through forest underbrush. And try not to shudder.
It’s not just the deeply creepy buzzing of its engine, like the threnody of a thousand angry bees. It’s the human fluidity with which its limbs move — as if two black-clad members of Mummenschanz were hidden under its carapace. More remarkable is its seeming indestructibility. A researcher kicks it with all his might: it wobbles drunkenly, nearly falls, but then rights itself and keeps moving. On slick ice, its legs splay out like a newborn colt’s, but it always manages to find its footing.
It may be meant for cargo carrying; still, one can’t help but picture this crazy, lifelike thing with guns strapped to its side. In short, the BigDog makes the exploring and ordnance-disposing PackBot — another Massachusetts-made military machine, invented at Burlington’s iRobot — seem R2-D2-friendly by comparison.
When people fantasize about robots ruling the planet, as they long have, this is usually how it starts: more-or-less humanoid mechanized creatures, usually weaponized, destroying their hubristic creators en masse.
Could it actually happen? The fact that several experts are up in arms about an irresponsible “robot arms race” they feel could get out of control suggests it’s at least possible.
This past February, in advance of a London conference on “The Ethics of Autonomous Military Systems,” Noel Sharkey, an AI and robotics professor at the University of Sheffield, told New Scientist magazine that he was “really scared” by how many nations — not least the US — are developing military robots that could eventually kill without any human guidance whatsoever.
There are currently more than 4000 semi-autonomous robots in Iraq alone, and Sharkey said giving machines like them the power to decide when to “pull the trigger” isn’t too bright. The Pentagon, New Scientist reported, is nonetheless “nearly two years into a research program aimed at having robots identify potential threats without human help.”