
Dear Elon, Turn Off Autopilot When Drivers Use Their Cellphone

People don’t always make the best decisions.

Although Tesla’s new Autopilot feature came with clear warnings for drivers to keep their hands on the wheel at all times and to stay alert in case they need to take back control of their vehicle, people of course didn’t listen. What’s worse, many started recording videos of themselves with their hands off the steering wheel while Autopilot was enabled, and one man even turned on the feature and then climbed into the back seat of his car.

Tesla, although initially stating “we trust our customers and we expect them to be responsible”, has admitted that not all of its drivers can be trusted:

“There’s been some fairly crazy videos on YouTube … this is not good. And we will be putting some additional constraints on when Autopilot can be activated to minimize the possibility of people doing crazy things with it” — Elon Musk

It’s somewhat comforting to hear that they’ll be working to ‘minimize the possibility of people doing crazy things with it’, but what about people doing ‘normal’ things with it?

And so, while the people at Tesla are investigating ways to minimize reckless behavior with Autopilot, I would like to propose a restriction that is not just for the crazies but for the masses:

Turn Autopilot off when people are actively using their cellphones.

Distracted Driving is a Big Problem

I hope that, by now, people have begun to realize how big a problem distracted driving really is.

In 2012 alone, 3,328 people were killed in crashes involving a distracted driver in the United States, and an additional 421,000 people were injured.

Unlike the reckless and irresponsible stunts of drivers testing the limits of Autopilot, distracted driving is a common mistake that almost everyone makes, and it already causes thousands of deaths a year.

But Doesn’t Autopilot Protect From Distracted Driving?

For some, the idea of turning Autopilot off when someone is distracted may sound counter-intuitive.

Autopilot, in theory, eliminates the harms of distracted driving by taking control of the vehicle and avoiding danger. In many ways, it seems like it was designed specifically to address distracted driving.

The problem, however, is that Autopilot is not yet fully baked. The crazy YouTube videos prove this, and even Tesla itself admits it.

As drivers become more used to Autopilot, they’ll grow comfortable cheating with it more and more often, and their distracted sessions will last longer. This could turn many people who would otherwise rarely use their phone while driving into people who do it habitually.

Until Autopilot has shed its beta label and is safe to be used autonomously, it will only make distracted driving worse.

How would it work?

If a driver is actively using their phone, Autopilot would immediately be disabled, with some sort of audio/visual alert to let them know.

Technically, there may be a number of ways to do so. Here’s one possible approach: any Tesla vehicle with Autopilot could require the Tesla mobile app to be installed on the driver’s iOS or Android device. Once the app is installed and the phone is paired to the car, the vehicle could monitor accelerometer readings from the phone (similar to how fitness trackers do) to detect major movements, such as when a driver picks up their phone. An even simpler option would be to monitor whether any app is actively in use, which is possible on Android but, to my knowledge, not yet on iOS.
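To make the accelerometer idea more concrete, here is a minimal, hypothetical sketch of what the phone-side detection could look like on Android. The PhonePickupDetector class and the notifyVehicle callback are my own illustrative assumptions, not part of any real Tesla app or API; the only real APIs used are Android’s standard sensor framework.

```kotlin
import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager
import kotlin.math.sqrt

// Hypothetical phone-side detector: the class name and the notifyVehicle
// callback are illustrative only; they are not part of any real Tesla API.
class PhonePickupDetector(
    context: Context,
    private val notifyVehicle: () -> Unit // e.g. send a message to the paired car
) : SensorEventListener {

    private val sensorManager =
        context.getSystemService(Context.SENSOR_SERVICE) as SensorManager
    private val accelerometer: Sensor? =
        sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER)

    // A phone resting in a cradle reads roughly gravity (~9.8 m/s^2);
    // a spike well above that suggests it was just picked up or jostled.
    // The threshold is an assumption and would need tuning on real data.
    private val pickupThreshold = 12.0f

    fun start() {
        accelerometer?.let {
            sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_NORMAL)
        }
    }

    fun stop() {
        sensorManager.unregisterListener(this)
    }

    override fun onSensorChanged(event: SensorEvent) {
        val x = event.values[0]
        val y = event.values[1]
        val z = event.values[2]
        val magnitude = sqrt(x * x + y * y + z * z)
        if (magnitude > pickupThreshold) {
            // The car would then disable Autopilot and play an audio/visual alert.
            notifyVehicle()
        }
    }

    override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) {
        // Accuracy changes are not relevant for this sketch.
    }
}
```

A real implementation would, of course, need to debounce noisy readings and distinguish ordinary road vibration from an actual pickup, but the core signal, a sustained jump in acceleration magnitude above resting gravity, is the same one fitness trackers rely on for movement detection.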

A real problem for real people

Coming from a Machine Learning background, I can understand and appreciate the allure of releasing this feature into the wild to collect data so that it can be improved. It’s a chicken-and-egg problem: Autopilot won’t ever be ready until it has massive amounts of real-world data, but it can’t collect that data until it’s ready.

The ethics of making a public beta available before it’s fully ready is not something I wish to debate in this post. I would, however, like to argue that the least Tesla can do during the beta period is to be cautious and address common, expected behavior among its drivers.

Distracted driving is inevitable. Autopilot will only encourage drivers to drive more distracted, so until it’s fully safe, the least Tesla can do is try to limit how it’s used.

Help spread the word

If you agree with the idea, please share this post and recommend it in the hope that someone at Tesla will see it. Even better, if you know someone who works at Tesla, please forward the article to them directly.


UPDATE:

It’s been great to receive feedback and discussion on this piece. It’s a reflection of the Medium community and the type of meaningful discussion it can bring forward (Thanks Ev and team!).

I wanted to address a few responses and ideas.

(1) Why not make a simpler solution where the user has to keep a Tesla specific app open to enable Autopilot?

It would be a big hassle to ask people to do that, and many simply wouldn’t use Autopilot as a result. People want quick in-and-out access to their vehicle, which is one of the reasons Bluetooth pairing works so well. The check needs to happen automatically.

(2) What if there were sensors built into the steering wheel?

This would of course require additional hardware and cost money but, more importantly, it wouldn’t always be reliable. Many drivers, I could even argue most drivers, steer with one hand for the majority of their ride. Therefore, if a single hand is allowed, the second hand could easily be using a cellphone.

(3) Turning Autopilot off when someone is using their cellphone is counterintuitive!

I thought I had addressed this point in my piece, but a few people may have overlooked it. Until Autopilot is complete and out of beta, it should be restricted to situations where driver attention is most likely.

Let’s remember what Autopilot is actually capable of: taking control of your car. It isn’t like a life jacket, whose failure only matters when someone is drowning and needs it to stay afloat. Autopilot can actively steer your car into an accident.

Just take a look at this video to see what I mean:

This driver, unlike other irresponsible drivers who were simply trying to test Autopilot’s limits, was actually following the instructions Tesla gave him. He had his hands on or near the wheel and was watching the road attentively. Then Autopilot nearly steered him into an oncoming vehicle and he had to take back control. Now imagine if he had been glancing at his cellphone, having grown accustomed to Autopilot fixing his small mistakes.

And let’s remember that Autopilot’s failure doesn’t only affect the driver; it affects everyone on the road.

Just think about it: would you feel comfortable swimming in a lake full of boats running experimental technology that not only can fail but is actually known to fail from time to time? It’s the same situation here. The least Tesla can do is make the feature more restrictive until they feel comfortable that it’s ready to be used.

(4) This sounds cumbersome and like a bad user experience. What if a passenger is trying to use my phone?

That’s exactly the point: to restrict Autopilot’s use so it’s enabled in fewer situations until it’s reliable. People shouldn’t yet become accustomed to relying on the feature; rather, it should be an additional safety measure on top of their own driving, a bonus, not an expectation.

So if a passenger is using your phone, then yes, Autopilot would be disabled momentarily. The driver should already be in control of their vehicle, with Autopilot only assisting, so it’s not critical to have it enabled at all times. It’s a small price to pay when you consider the alternative.


Thanks to those who responded and I look forward to hearing more insightful thoughts.