Autonomous Vehicles Need to Learn Empathy

At Least if We Don’t Want Them to Crash in Intersections

Jordan Elpern Waxman
Jordan Writes about Cities
5 min read · Apr 3, 2017


A diagram of the accident from the police report. V2 is the AV; V1 is the human driver.

Intersections are the most dangerous place on the road. And little wonder: they are the “planned points of conflict in any roadway system,” in the words of the Federal Highway Administration. According to FHWA and NHTSA data, intersections account for 23% of fatal accidents, 40% of all crashes, and 50% of crashes involving an injury or fatality. Uber’s recent accident in Arizona illustrates why, as well as why it will be hard for AVs to navigate intersections on sensor data and machine learning alone.

The original report on Uber’s March 24 accident went something like this (from The Verge):

“The program came to a grinding halt Friday when a self-driving Uber car was knocked onto its side by another vehicle. Police in Tempe say the self-driving SUV was obeying the law and the driver in the other car failed to yield. That person was cited for a moving violation after the Friday night crash, according to the AP.”

The clarity of the police statement and the speed with which it was delivered made it sound like an open-and-shut case. Carry on, Uber.

But then the other driver, as well as third-party witnesses, made their statements, and a new story came to light:

… Alexandra Cole was making a left turn across three lanes of traffic [just as the light was changing] from green to yellow. The first two lanes were backed up with cars, and Cole crossed them at a speed of about 20 mph. Then, she approached the third lane.

“As far as I could tell, the third lane had no one coming … so I was clear to make my turn,” Cole wrote in her testimony. “Right as I got to the middle lane about to cross the third, I saw a car flying through the intersection but couldn’t brake fast enough to completely avoid collision.”

That car was an Uber SUV that employees Patrick Murphy and Matthew Rentz were operating in self-driving mode. The car was going at an estimated 38 mph, Murphy wrote, two miles per hour under the posted speed limit.

“The traffic signal turned yellow as I entered the intersection,” Murphy wrote. “As I entered the intersection, I saw the vehicle turning left. … There was no time to react as there was a blind spot created by the line of traffic in the southbound left lane.”

[An Uber spokeswoman] said Wednesday that its self-driving vehicles will cross intersections at a yellow light if the vehicle has enough time to do so at its current speed. [emphasis mine]

So we have the AV following the letter of the law, but failing to do what most human drivers would do, which is to think about what the intersection looked like to human drivers coming from other directions (presumably it didn’t need to think about what it looked like to humans coming from its own direction, because it could “see” them) and what one of those drivers on the other side of the blind spot would likely do given their view of the road. A human would have thought, probably instantly, instinctively, and without even being aware of it, something like the following:

“This next intersection is dangerous. Cars coming in the opposite direction might try to turn left [a driver, human or AV, may even have seen some do so before reaching the intersection], but as I approach, these two blocked lanes on my left will create a blind spot that gets worse the closer I get. If they are in my blind spot, I must be in theirs, making it doubly dangerous. Because of risk homeostasis, I had better slow down and proceed cautiously across the intersection. If the light turns yellow, it is almost certain that a car waiting in the intersection for an opening to turn left will try to beat it [the light]. That driver is probably frustrated from waiting in the turn lane for an entire light cycle, maybe several, and will try to sneak across traffic at the light change. In anticipation of that, I will yield, because an accident is a worse outcome than having to yield the right of way in violation of my rights. Therefore I will approach extra cautiously and slow to a speed at which I could slam on the brakes if necessary and avoid a collision even at my slow, human reaction times.”

As a result of this caution, the human would most likely not arrive at the intersection until the light was well into its yellow phase, and would stop and wait for it to turn green again.
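As a rough sketch of that last instinct (“slow to a speed at which I could slam on the brakes”), an occlusion-aware approach might cap speed so that reaction distance plus braking distance still fits within the distance to the hidden conflict point. Every number below — reaction time, deceleration, distance to the blocked lanes — is an illustrative assumption of mine, not anything from the police report:

```python
import math

def max_safe_approach_speed(dist_to_conflict_m: float,
                            reaction_time_s: float = 1.5,
                            decel_mps2: float = 6.0) -> float:
    """Largest speed (m/s) from which the car can still stop short of the
    conflict point: reaction distance + braking distance <= available distance.
    Solves v*t + v**2 / (2*a) = d for v."""
    a, t, d = decel_mps2, reaction_time_s, dist_to_conflict_m
    return -a * t + math.sqrt((a * t) ** 2 + 2 * a * d)

# Illustrative only: roughly 40 m to where a left-turner would emerge
# from behind the two blocked lanes.
v_mps = max_safe_approach_speed(40.0)
print(f"cap approach speed at about {v_mps * 2.237:.0f} mph")  # ~33 mph
```

With those made-up numbers, the “cover the brakes” speed comes out several mph below the 38 mph the Uber was reportedly doing.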

The AV, on the other hand, as programmed by Uber, thinks:

My current speed is sufficient to make it through this yellow light without accelerating. Therefore I am going to continue through the light at my current speed, which is less than the speed limit, as I have a right to do according to the traffic ordinance. Whoever may be in my blind spot will yield, because that is their duty according to the right-of-way rules, which say that cars traveling straight through an intersection have priority over those turning.

The AV does not, at least for now, try to anticipate how the other human might behave in all of her “humanness,” because the AV lacks empathy (note the two words I emphasized in the Uber spokeswoman’s statement, “if” and “will.” Uber is not using AI here to take into account the subtleties of human driving behavior; it is using a simple if-then statement). The AV cannot imagine what the road looks like to a human. It’s not clear whether the AV can even imagine what the world looks like to another AV.
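To make the contrast concrete, here is a minimal sketch of the two decision rules as I read them. The function names, arguments, and thresholds are my own illustrative assumptions, not Uber’s actual code:

```python
def uber_rule(time_to_clear_s: float, time_to_red_s: float) -> str:
    """Roughly what the spokeswoman described: proceed IF the car can clear
    the intersection at its current speed before the light turns red."""
    return "proceed" if time_to_clear_s <= time_to_red_s else "stop"

def empathic_rule(time_to_clear_s: float, time_to_red_s: float,
                  left_turn_lane_occupied: bool, view_occluded: bool) -> str:
    """Hypothetical occlusion-aware variant: legality is necessary but not
    sufficient. If my view is blocked, assume the other driver's view of me
    is blocked too, and assume a waiting left-turner will jump the change."""
    if time_to_clear_s > time_to_red_s:
        return "stop"
    if view_occluded and left_turn_lane_occupied:
        return "slow and cover the brake"  # yield even with the right of way
    return "proceed"

# The situation from the report: light going yellow, turn lane backed up,
# two blocked lanes creating an occlusion.
print(uber_rule(2.5, 3.0))                  # "proceed"
print(empathic_rule(2.5, 3.0, True, True))  # "slow and cover the brake"
```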

The result is predictable:

Uber’s unempathic autonomous vehicle in the background behind an Arizona police officer who seems like he could use a bit of empathy about now.

A human would have instantly gamed it out before reaching the intersection: I know that they know that if I am approaching the intersection from their blind spot, then they (and the entire left-turn lane they occupy) are in my blind spot. Therefore I will slow down, per the risk-homeostasis reasoning above, while they make it across safely. The AV? So sure of itself that it doesn’t account for the physics of braking just in case the other driver doesn’t do what she is legally “supposed” to.

In other words, AVs need to learn game theory. To learn game theory, AVs need to learn empathy.
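A toy version of that game, with made-up numbers: treat “assert my right of way” versus “yield” as a choice under uncertainty about whether the hidden left-turner goes. The probabilities and costs below are illustrative assumptions, but the asymmetry is the point — a collision costs far more than a few seconds of delay:

```python
# Made-up illustrative values: even a modest chance that the occluded
# driver jumps the yellow makes yielding the cheaper choice in expectation.
p_turner_goes = 0.2      # assumed chance the hidden driver tries to beat the light
cost_collision = 1000.0  # arbitrary units: crash, injuries, totaled car
cost_yield = 1.0         # a few seconds lost slowing down

expected_cost_assert = p_turner_goes * cost_collision  # 200.0
expected_cost_yield = cost_yield                        # 1.0
print("yield" if expected_cost_yield < expected_cost_assert else "assert")
```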

A lot of effort is going into solving the machine learning problems of correctly reading and interpreting sensor data. Maybe we also need to send AVs to empathy school?
