Robots that kill

Enrique Dans

--

The device in the picture is a Remotec Andros, a robot made by Northrop Grumman and used by security forces to deactivate bombs. At a cost of $200,000, it was one of three the Dallas police force had recently bought to increase its ability to deal with possible terrorist bomb attacks.

But on Friday, the robot was used not to deactivate a bomb but to do exactly the opposite: to carry and then detonate an explosive device, presumably of the kind used to provoke a controlled explosion, next to where Micah Xavier Johnson, the sniper responsible for killing a number of people during a peaceful protest march in Dallas on Thursday night, had taken cover, killing him.

The decision, possibly the right one given that Johnson had military training and was presumably heavily armed, has caused controversy, with some pundits dubbing it the beginning of the Robocop era.

In reality, the US military already uses semi-autonomous robots to kill people: the “drone war” unleashed by the Obama administration in Pakistan, Yemen and Somalia, countries the United States is not at war with, but that provide refuge to its enemy, Al-Qaeda. The attacks are believed to have killed at least 1,270 civilians by mistake, and there have been a number of high-profile cases involving the deaths of hostages during rescue missions.

But the police’s use of a robot armed with a bomb to kill a suspect is without precedent, and it raises other possibilities, such as using a robot-controlled water cannon, a Taser-type electric stun gun, or some kind of chemical agent to incapacitate people.

The robot used yesterday, like military drones, was remote-controlled by a police officer, but it is easy to see how we could soon reach a situation where autonomous robots programmed with artificial intelligence are used in armed conflicts. Warnings of the dangers of such an approach have already been raised by people like Elon Musk, Steve Wozniak and Stephen Hawking, who have called for ethical guidelines to be established.

Using robots in difficult, fast-moving situations like the one in Dallas could sometimes make sense. But if this is to be the case, we will have to assume that such situations are going to become regular events, meaning that these machines will have to be designed to avoid endangering the police and the public, while working toward preventing deadly outcomes, in every sense.

(In Spanish, here)

--

Enrique Dans

Professor of Innovation at IE Business School and blogger (in English here and in Spanish at enriquedans.com)