Image: Tesla

‘Hold Steering Wheel’: Tesla’s approach to safety

Martin Colebourne
tobiasandtobias

--

How did the conversation go, I wonder? Actually, it cannot have been a single conversation; the issue must have come up more than once: what do we do if the driver fails to keep their hands on the wheel?

Tesla has been pushing autonomous driving as a core part of its vision from the very beginning, and they included an early version of their ‘Autopilot’ feature from October 2014 onwards.

Despite the name, which seems to promise fully-capable self-driving, the feature was relatively restricted. It could take full control of the car’s acceleration, braking and steering, but only on particular roads: large freeways with a solid central divider. Why these roads? Although speeds are higher, they are much simpler than other types of road: interactions with other traffic are more constrained, junctions follow a predictable pattern and obstructions are less common.

Whilst capable of self-driving in this environment, the system was not perfect, so there was a caveat — drivers must keep their hands on the wheel, ready to take over if something goes wrong.

So the question for the engineers is: what do we do if the driver fails to keep their hands on the wheel? To begin with, the answer is simple: give the driver a visual and auditory warning, telling them that they must keep their hands on the wheel. This should be fine for most people, but what if the driver simply ignores it? Or alternatively, what if they put their hands back momentarily, to cancel the warning, and then just take them off again?

I pitched the problem to one of my colleagues. After a moment’s consideration, she said: “I suppose you would just have to brake the vehicle and bring it to a stop.”

It seems pretty obvious, doesn’t it? You have a safety-critical system, which the driver must monitor to ensure its safe operation, but they ignore warnings and fail to do so. This renders the entire operation unsafe, so the automated system should refuse to function, shouldn’t it?

In fact, there are several things you could do:

If the driver simply fails to place their hands back on the wheel, then it would make sense to bring the vehicle to a safe halt; after all, the driver may actually be in trouble: they may have collapsed, for example.

If the driver places their hands back on the wheel to cancel the alarm, but then takes them off again, you could take a different approach: warn them that Autopilot will disconnect, then, the next time their hands return to the wheel, place the car into manual mode and keep Autopilot disabled until they bring the car to a stop, turn it off and turn it back on again. This inconvenience would reinforce the need to comply with the requirements.

In the event that someone showed a pattern of ignoring the warnings, you could even disable the Autopilot feature completely until they returned to a dealership to have it reset. That would provide a much greater incentive and give an opportunity to talk to them and reinforce the need for care. Taken together, these options amount to a simple escalation policy, sketched below.
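To make the contrast with Tesla’s choice concrete, here is a minimal sketch of what such an escalation policy might look like in code. It is purely illustrative: the class, thresholds and action names are my own assumptions, not anything Tesla has published.

```python
# Hypothetical escalation policy for a hands-on-wheel monitor.
# All names, states and thresholds are illustrative assumptions,
# not Tesla's actual implementation.

from dataclasses import dataclass


@dataclass
class HandsOnMonitor:
    warning_limit_s: float = 15.0    # hands-off time before a warning escalates
    strikes_before_lockout: int = 3  # warn/cancel cycles tolerated before disabling

    ignored_warnings: int = 0
    autopilot_locked_out: bool = False

    def decide(self, hands_off_duration_s: float, warning_was_cancelled: bool) -> str:
        """Return the action the car should take for the current hands-off event."""
        if self.autopilot_locked_out:
            return "manual_only"  # feature stays off until it is reset

        if hands_off_duration_s < self.warning_limit_s:
            return "warn"  # visual and auditory warning, nothing more yet

        if warning_was_cancelled:
            # Driver keeps touching the wheel just long enough to silence the alarm.
            self.ignored_warnings += 1
            if self.ignored_warnings >= self.strikes_before_lockout:
                self.autopilot_locked_out = True
                return "disable_autopilot"  # option 3: require a reset to re-enable
            return "disengage_until_restart"  # option 2: drop to manual for this trip

        # No response at all: the driver may be incapacitated.
        return "bring_to_safe_stop"  # option 1: slow down with hazard lights on
```

A supervising loop would call `decide()` each time the steering sensor reported the wheel untouched and act on the returned action. The point is not the detail but that escalation beyond a repeated warning is straightforward to express.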

So what did Tesla’s engineers decide to do when faced with this question? Well, they decided that in the event the driver ignored the warnings, they should just — warn them again.

That’s it.

If the driver fails to put their hands back on the wheel, or just repeatedly ignores the warnings, then just warn them again. Do you see any problems with that?

Online videos soon emerged of early Tesla owners filming themselves driving along in Autopilot — pulling stunts and messing about. Clearly they were not carefully supervising an imperfect, early implementation of autonomous driving. In other words, the technology allowed people to behave irresponsibly and they did so.

In May 2016, a Tesla driver in Florida engaged Autopilot on a freeway, but one without a central divider. Remember the restriction on the types of road it could be used on? Well, there was nothing in the vehicle to ensure that it was adhered to.

Why was a central divider so important? Because the system was not designed to detect crossing traffic — vehicles moving in a perpendicular direction across the carriageway.

The driver then repeatedly ignored the warning to put his hands back on the wheel. According to the National Transportation Safety Board (NTSB) investigation, during a 37-minute period of the trip, the driver had his hands on the wheel for just 25 seconds. The system provided 13 warnings to the driver, to which he responded after an average delay of 16 seconds.

The conditions for disaster were set. As the car approached a slight rise in the road, a large truck coming from the other direction started to turn across the carriageway to enter a side road. Autopilot failed to detect the obstruction (it had not been designed to); neither did the driver, who had not been forced to adhere to the safety requirements.

The vehicle ran at full speed into the truck, killing the driver instantly.

What are we to make of Tesla’s attitude to safety? They were happy to release a highly limited form of autonomous control for the public to try out. And they took the decision to do nothing in the event that the driver ignored the requirement to monitor its use. The question is — has anything changed?

Image: National Transportation Safety Board (Mountain View crash scene graphic), public domain, via Wikimedia Commons

In March 2018, a Tesla driver in California was killed when his vehicle crashed into a concrete lane divider whilst in Autopilot. The NTSB’s preliminary report states that the driver had received multiple warnings to keep his hands on the wheel earlier in his journey, and that his hands were not on the wheel during the final six seconds. Far from avoiding the impact, the vehicle stopped following the car in front and accelerated in the final seconds.

How has Tesla responded? They frustrated the authorities by releasing details of the crash to the press, emphasising the fact that the concrete barrier had been damaged previously. Whilst the damaged barrier certainly increased the severity of the impact on the vehicle, this focus seems designed to distract attention away from the failure of the Autopilot system.

Tesla also sought to deflect blame onto the driver, pointing out that the driver would have been able to see the concrete divider well in advance of the accident. Elon Musk, CEO of Tesla, told CBS: “The system worked as described, which is, it’s a hands-on system. It is not a self-driving system.”

This hardly seems a reasonable position when the system has been designed to be so tolerant of drivers who are not paying full attention. Tesla continues to seem more interested in pushing their technology than in taking a strong stance on safety.

--

Martin Colebourne
tobiasandtobias

Martin writes thought-provoking essays on science, philosophy, politics and design that nobody reads.