The increasing deployment of gun-toting robots by the U.S. military and other armed forces around the world could end up endangering civilian lives and giving terrorists new ideas, warns a U.K. robotics professor.
The U.S. Department of Defense (DoD) has outlined plans to ramp up the use of remotely controlled robotic vehicles on land, undersea and in the air. The goal is to field increasingly autonomous robots, operating without a human controller, to dispose of explosives, stand guard and spot targets to attack. Nations such as South Korea and South Africa have also begun adopting armed robotic systems.
The prospect of armed, autonomous robots is enough to rattle Noel Sharkey, professor of computer science at the University of Sheffield, England. "One of the fundamental laws of war is being able to discriminate real combatants and noncombatants," he says. "I can see no way that autonomous robots can deliver this for us." Even today's unmanned air and ground vehicles could do harm, he cautions, by teaching insurgents new ways to mount devastating attacks from a safe distance.
But aside from the technological challenges of developing autonomous weapons, it remains unclear how quickly military brass would adopt a high-tech approach that takes soldiers out of the equation, or whether terrorists would be interested in weapons that might not inflict as many casualties as traditional attacks.
Congress in 2001 mandated that one third of military ground vehicles be unmanned by 2015. According to the DoD report, "Unmanned Systems Roadmap 2007–2032," the Pentagon plans to spend $4 billion by 2010 on unmanned systems technology, with an eye toward increasing autonomy to free up troops who would otherwise have to monitor the robots closely.
An autonomous Chevy Tahoe successfully navigated a 60-mile (96-kilometer) urban course this past November in four hours to win the Defense Advanced Research Projects Agency's 2007 Urban Challenge. "It is quite realistic," Sharkey says, "to have autonomous vehicles that are not monitored—to take supplies and navigate from place to place."
Giving them license to kill would be another matter. Despite decades of research in the field of artificial intelligence (AI), computers remain unable to make simple visual discriminations such as picking a cow out of a barnyard scene. Robotic systems would be hard-pressed to tell friend from foe even in ideal conditions, let alone amid the smoke and confusion of battle, Sharkey says.
Mindful of these limitations, Sharkey, who moonlights as a judge in televised robot contests such as BBC Two's Robot Wars series, proposes a global ban on autonomous weapons until they can comply with international rules of war prohibiting the use of force against noncombatants.
The U.S. armed forces currently have more than 4,000 robots deployed in Iraq and Afghanistan. Unmanned aerial vehicles (UAVs) such as the Predator, used for reconnaissance and missile strikes, along with smaller, hand-held fliers have logged more than 400,000 hours of flight time, according to the DoD report.
The report says that robotic ground vehicles, including the Talon, a miniature treaded tank in operation since 2000, have disposed of thousands of improvised explosive devices (IEDs) in Iraq and Afghanistan. Foster-Miller, Inc., the Waltham, Mass.–based manufacturer of the Talon, says that three armed units, called Talon-SWORDS (for special weapons observation reconnaissance detection system), rolled into Iraq equipped with machine guns last year.
Sharkey's fear is that unmanned technology could fall into the wrong hands. Hezbollah, the Lebanese Shiite Islamic paramilitary group, reportedly flew UAVs (likely supplied by Iran) across the Israeli border in 2004 and 2006. He says it would be relatively easy for terrorist or insurgent groups to mount explosives on remote-controlled cars or airplanes.
Dennis Gormley, a senior fellow at the James Martin Center for Nonproliferation Studies' Washington, D.C., office and a specialist in missile systems, says that UAV technology in particular raises ethical issues, because it gives combat planners the opportunity to make lightning-quick strikes that were previously impossible. In one well-known example, the Central Intelligence Agency in 2002 killed six suspected al Qaeda members in Yemen by launching Hellfire missiles at their car from a circling Predator.
But Gormley says that worries over robots run amok ignore the realities of military and terrorist decision making. He notes that Air Force officials in particular tend to drag their heels on technologies that might make their pilots appear obsolete.
He says that would-be terrorists could deliver up to several hundred pounds of explosives by converting a build-it-yourself airplane into a UAV, but adds that the conversion would require several years of technically challenging work. "Frankly," he says, "I think that's beyond the capacity of any terrorist group."