Police cars on Main Street in Dallas following the sniper shooting during a protest on Thursday. Laura Buckman/AFP/Getty Images
After sniper fire struck 12 police officers at a rally in downtown Dallas, killing five, police cornered a single suspect in a parking garage. After a prolonged exchange of gunfire and a five-hour-long standoff, police made what experts say was an unprecedented decision: to send in a police robot, jury-rigged with a bomb.
“We saw no other option but to use our bomb robot and place a device on its extension for it to detonate where the suspect was,” Dallas Police Chief David Brown told a news conference Friday. “Other options would have exposed our officers to grave danger. The suspect is deceased as a result of detonating the bomb.”
At a Friday evening press conference, Dallas Mayor Mike Rawlings revealed that police used a common plastic explosive known as C4.
“The same automated robot equipment used to defuse bombs was used to place C4 in place and to detonate that,” Rawlings said.
“This was a man that we gave plenty of options to give himself up peacefully, and we spent a lot of time talking. He had a choice to come out and we would not harm [him], or stay in and we would. He picked the latter.”
Robots have been part of police tactical equipment for years — used to surveil crime scenes, aid in hostage negotiations or defuse bombs — but this was a “unique use of equipment,” according to Chuck Canterbury, national president of the Fraternal Order of Police, the largest U.S. law enforcement union.
“I think it’s the first time that’s been utilized,” Canterbury told NPR. “I know that SWAT teams around the country have been training for that scenario, especially with terroristic-type threats, where you know that the offenders do not plan to live through them.”
This was not, in fact, the first time a police robot has been rigged to do something it wasn’t originally designed to do: instead of defusing a bomb, for example, delivering a flash or smoke grenade to incapacitate a suspect, experts say. But it was apparently the first purposeful killing of a suspect using such a rig.
“Given how many police [departments] have robots and given how versatile they are and the various uses to which they’ve been put, including in hostage situations, I think we’ll find that there have been other examples of this,” says Ryan Calo, a professor at the University of Washington School of Law who studies robotics and cyberlaw. “As far as I know, this is a first time that they’ve used a robot to intentionally kill someone.”
Peter Singer, author of Wired for War: The Robotics Revolution and Conflict in the 21st Century, has studied technology and war since the mid-2000s. He says U.S. soldiers in Iraq have used similar robots to deliver explosives, arming them by duct-taping bombs to the device.
“This would be, to my awareness, the first time that we’ve seen police use a robotic system in this way,” Singer told NPR.
The decision to use the robot has drawn attention for its inventiveness in the face of a challenging and violent situation with few good options. But the incident has also led to calls for the drafting of clearer law enforcement policy about lethal or potentially lethal uses of robots.
Lots of robots
As with much law enforcement technology, robots joined the ranks of police and SWAT teams after a stint in the military.
Bomb robots — known formally as explosive ordnance disposal, or EOD, robots — made their debut in Northern Ireland in the 1970s. They didn’t become widely used by the U.S. until the 1990s, with two of the first models designed by iRobot, the same company that created the Roomba, a robot vacuum cleaner.
Bomb robots became widely used by U.S. troops in Afghanistan and Iraq to deal with improvised explosive devices.
As of 2015, 201 federal, state and local law enforcement agencies had bought at least one explosive ordnance disposal robot through the military’s 1033 program, according to data from the Pentagon’s Defense Logistics Agency. That program distributes excess military equipment to police departments and other agencies across the country. Courtesy of the Center for the Study of the Drone
Last year, the San Jose Police Department used one such robot — a Northrop Grumman Remotec Andros F6A — to deliver a phone and a pizza to an armed man on a freeway overpass, eventually talking him out of committing suicide.
Exactly how many bomb disposal robots are in use by police around the country is unclear. The Pentagon’s 1033 program distributed 479 EOD robots between 2006 and 2014, according to an NPR analysis.
According to a 2015 study by Bard College’s Center for the Study of the Drone, the Dallas County Sheriff’s Department bought one EOD robot in 2014 for $10,000. The FBI Dallas Division that year purchased scores of robots at the same per-unit cost, according to the study.
There may be other robots not accounted for by the 1033 program. Northrop Grumman confirmed to Vice last year that more than 1,100 of its Remotec robots are distributed across the country, used in more than 90 percent of police bomb squads.
It’s unclear what kind of bomb robot was used in Dallas this week, though photographs suggest similarities to a Northrop Grumman product. The company declined to comment.
When is it time to worry?
As robots have become an integral part of emergency situation response — think mine collapses, oil spills, natural disaster relief — Calo, the researcher, says he wasn’t surprised to hear of a tactical robot featuring in the Dallas incident.
And he raises an interesting question: Would the police response have garnered as much attention if they had used a more traditional means of killing a violent suspect?
Calo says the time to get nervous about police use of robots isn’t in extreme, anomalous situations with few good options like Dallas, but if their use should become routine.
“I think we get worried when robots start to get used in traffic stops, or stops on the street, when we start to put nonlethal weapons on drones so that the officer doesn’t even need to approach the individual,” Calo says. “Before that, I just think we should have a policy so that officers know what they can and can’t do.”
The effort to develop clear policies may result in a patchwork of local regulations similar to those for drones and body cameras, says Elizabeth Joh, law professor at the University of California, Davis. Plus, she says, it raises new questions about when lethal force is justified or deemed excessive in the world of remote-controlled robots.
ACLU senior policy analyst Jay Stanley says that, as a legal matter, “the choice of weapon in a decision to use lethal force does not change the constitutional calculus,” though robots make it easier to apply deadly force, raising concerns about overuse.
“This was a makeshift response to an ongoing emergency,” Joh says, “but we shouldn’t be surprised if police departments that are watching this situation decide they want to be proactive and have such a robot at hand should a similar crisis arise. The question going forward is, ‘Should this be used again, and when?’”