Tesla Autopilot Review: Bikers will die
Heather Knight, Roboticist
My colleague and I got to take a TESLA Autopilot test drive on highways, curvy California roads, and by the ocean. In case you don’t live in Palo Alto (where the Whole Foods parking lot is full of these things)… the TESLA Autopilot feature is basically a button that puts the car into an autonomous driving mode.
So the car will speed up or slow down based on what’s in front of it, and supposedly stay in its lane and follow the turns of the road automatically.
Autopilot classified ~30% of other cars, and 1% of bicyclists
The purpose of this post is to share my first impressions of this system, particularly regarding its human-machine interfacing. I’m concerned that some will ignore its limitations and put biker lives at risk; we found the Autopilot’s agnostic behavior around bicyclists to be frightening.
But as a human-in-the-loop system, this car’s features would impress Iron Man.
Quick background: Dylan Moore and I work for Dr. Wendy Ju’s research group at Stanford University’s Department of Mechanical Engineering. The group sometimes dubs itself “transformers,” because our research is half social robots, half autonomous driving. We often find that insights in one domain cross apply to the other. Long story short, Dylan and I are familiar with the shortcomings of robot perception systems, and care about interface design.
Since it’s our field, Wendy Ju had us rent a TESLA so that our group could experience the closest thing out there to consumer autonomous driving today. Naturally, we took it to the beach. For research. I share Dylan’s and my report card for its features below.
Engineering Sexiness Report
B [DOOR HANDLES THAT RECEDE INTO THE FRAME] — super sexy, but watch your fingers! The car detects the proximity of the keys and automatically locks as you walk away. It misses some of the satisfaction of actively locking a car: the locking is not initiated by you, and there is no auditory confirmation that it has happened.
Note: system will not actually damage fingers.
A+ [AUTOMATIC LANE SWITCHING] — love it: intuitive, reliable, super cool! Switch on your left-turn blinker on the highway and the car will wait for an opening and switch lanes automatically. Works great and makes sense to the user.
B [CURVES] — the car turns too late to cue human trust. Hard to know if it would have worked; we didn’t want to risk it. My PhD thesis was about Expressive Motion, so I have ideas about how TESLA could improve people’s trust, but depending on how reliable the car actually is, that might not be a good thing.
C [USER-SET TARGET VELOCITY] — dangerous: Autopilot seeks to reach the cruise-control set speed whenever there is no obstacle ahead. This works fine on a consistent road like a highway, but we discovered its limits the hard way when we exited the highway onto a country road, switched Autopilot on, and the car tried to go from 30 to 65 mph at maximum acceleration. Expert users would be familiar with this, but we think Tesla can do better.
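The surprise on that country road follows directly from how a naive set-speed policy behaves. Here is a toy sketch in Python (our own simplification, not Tesla’s actual controller; the function name, acceleration limit, and time step are all ours) of a cruise controller that drives toward a stored target speed whenever the path ahead looks clear:

```python
# Hypothetical sketch of a naive set-speed policy (not Tesla's real
# controller): with no obstacle ahead, it closes the gap to the stored
# target speed as fast as the acceleration limit allows, regardless of
# whether you just exited onto a country road.

def naive_autopilot_speed(current_mph, set_speed_mph, obstacle_ahead,
                          max_accel_mph_per_s=8.0, dt=1.0):
    """Return the speed one time step later under a naive set-speed policy."""
    if obstacle_ahead:
        return current_mph  # follow the traffic ahead, not the set speed
    gap = set_speed_mph - current_mph
    # Clamp the speed change to the acceleration limit for this time step.
    step = max(-max_accel_mph_per_s * dt, min(max_accel_mph_per_s * dt, gap))
    return current_mph + step

# Our highway-exit scenario: set speed still 65 mph, car doing 30 mph,
# nothing detected ahead -- the controller floors it toward 65.
speed = 30.0
for _ in range(5):
    speed = naive_autopilot_speed(speed, 65.0, obstacle_ahead=False)
print(speed)  # reaches 65.0 after five seconds of maximum acceleration
```

A friendlier design might also condition the target on road context (exit ramps, speed-limit changes) rather than only on the presence of an obstacle, which is the gap we ran into.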
A+ [SITUATION AWARENESS DISPLAY] — this display helps the human driver build a mental model of what the car sees. I’d estimate that Autopilot classified ~30% of other cars and 1% of bicyclists. Failing to classify an object doesn’t mean the Tesla doesn’t see that something is there, but given the lives at stake, we recommend that people NEVER USE TESLA AUTOPILOT AROUND BICYCLISTS!
This grade is not for the detection system; it’s for exposing the car’s limitations. A feature that tells the human to take over is incredibly important.
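The interface principle we’re praising can be made concrete. Below is a minimal Python sketch of it (entirely our own construction: the function name, labels, and 80% threshold are illustrative assumptions, not anything Tesla ships): surface how much of the scene the perception stack actually classified, and prompt the driver to take over when coverage is low or a vulnerable road user is nearby.

```python
# Illustrative sketch (our own names and thresholds, not Tesla's) of the
# interface principle: expose perception coverage and tell the driver
# when to take over.

def takeover_prompt(detections, min_classified_fraction=0.8):
    """detections: one entry per tracked obstacle, either a label string
    or None for an object that was seen but not classified.

    Unclassified objects are still *seen* -- the car knows something is
    there -- but the driver should know how much the system understands.
    """
    if not detections:
        return "clear"
    if "bicyclist" in detections:
        return "TAKE OVER: bicyclist nearby"
    classified = sum(1 for label in detections if label is not None)
    if classified / len(detections) < min_classified_fraction:
        return "TAKE OVER: low perception confidence"
    return "autopilot ok"

# Roughly what we saw on the display: one car classified, two mystery blobs.
print(takeover_prompt(["car", None, None]))  # -> TAKE OVER: low perception confidence
```

The point is not the specific threshold; it is that the display turns the car’s internal uncertainty into an honest signal the human can act on.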
C [GIANT TOUCHSCREEN] — hire UX designers, Tesla!! Yes, it’s a big screen. Now make it intuitive to find things… it took us five screens to turn off the car. From a usability perspective, this is a system for experts, not novices.
Note: car automatically turns off as you walk away with keys, but we wanted the confirmation as new users and found the menu depth suboptimal.
F [SELF-LOCKING FEATURE] — we stepped out of the car to take a photo, leaving the keys in the car, and this super capable intelligent car locked us out! FIX THIS BUG! Engineers should account for how people will actually use a technology. The receding door handles made this action seem particularly petulant.
A+ [TESLA APP] — terrifying but awesome: our lab mate unlocked the Tesla from 30 miles away, since he had ridden in the car the day before. Beware of a future where you can’t use your car without cell-phone service!
Our Favorite TESLA Features
Winner: The Situation Awareness Display is great because it helps the driver understand shortcomings of the car, i.e., its perception sucks. Providing the driver an accurate mental model of the system probably saves lives, and robots in general would benefit from communicating their limitations to people.
Runner-up: The TESLA App saved our butts when we were locked out of the car and Mishel rescued us from the Stanford campus. There could have been worse places to be stuck than by the ocean in Half Moon Bay, but we encourage TESLA to fix the self-locking feature.
So in conclusion, and despite the marketing: do not treat this system as a prime-time autonomous car. If you forget that… bikers will die.
— Added May 29, 2017 —
Thanks for all the interesting comments, questions, and personal insights on Facebook, Twitter, Fortune.com, and Medium. My concern was that treating Autopilot as a fully autonomous system might be reckless for a person in a car but fatal to a bicyclist, who has far less protection. Encouraging a balanced mental model of the machine is exactly the goal of this article.
To clarify, the car we drove was an early-2016 Model S with “Hardware 1,” so there do exist Teslas with later versions of the software and additional sensing capabilities. Great question: Were we given training before being allowed to rent the car? The answer is no.
“Whenever I use Autopilot I have to really, really, really watch. I feel almost like I’m babysitting a 3 year old. It’s good for heavy traffic jams on highways that are nowhere near bikes.” -Amber Case
Here’s How TESLA Solves A Self-Driving Crash Dilemma, Forbes.com, by Patrick Lin
Fortune’s followup to this post, by Mathew Ingram
As I said in a comment: A big part of safety is encouraging the human drivers to have the correct mental model of the car, complete with its shortcomings. And yes, I do like Tesla’s.