(Photo credit: Jake Moore)

The Problem With Self-Driving Cars Is “The Driver”

Vincent T.
0xMachina
4 min read · Sep 21, 2021


According to an article from TechCrunch:

“Tesla CEO Elon Musk said the company will use personal driving data to determine whether owners who have paid for its controversial “Full Self-Driving” software can access the latest beta version that promises more automated driving functions.”

This expands the availability of Tesla’s FSD (Full Self-Driving) beta, version 10.0.1, from a small selected group to a wider group of Tesla owners. Access to the latest FSD update will be granted only to drivers with a good track record, which Tesla monitors. FSD is not the same as Tesla Autopilot, the driver-assistance feature that comes standard; FSD extends Autopilot with more self-driving capabilities. The software is still in beta, so it is by no means a final, stable production release.

If a self-driving car can drive itself, why would it need a good driver? That question arises because the latest FSD release is only available to drivers who demonstrate 7 days of good driving behavior (per a tweet by Tesla CEO Elon Musk). The truth is that Tesla’s FSD is not certified for SAE (Society of Automotive Engineers) Level 5 fully autonomous driving, where no human driver is needed. The “Full Self-Driving” name is very much marketing; in practice Tesla’s FSD would be Level 1 at the least and Level 2 at the most. It therefore still requires human driver intervention for proper operation. In other words, it is semi-autonomous.
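For reference, here is a rough summary of the SAE J3016 automation levels as a small Python lookup table. The level descriptions are paraphrased rather than official SAE wording, and the check at the end reflects the argument above that a Level 1–2 system still needs a driver.

```python
# Rough, paraphrased summary of the SAE J3016 driving automation levels.
SAE_LEVELS = {
    0: "No automation: the human driver does everything.",
    1: "Driver assistance: steering OR speed is assisted (e.g. adaptive cruise).",
    2: "Partial automation: steering AND speed are assisted, but a human must supervise at all times.",
    3: "Conditional automation: the car drives in limited conditions; the driver must take over on request.",
    4: "High automation: no driver needed within a defined operating domain.",
    5: "Full automation: no driver needed anywhere.",
}

def requires_human_driver(level: int) -> bool:
    """A supervising human only becomes unnecessary at Level 4 and above."""
    return level < 4

for level, description in SAE_LEVELS.items():
    print(f"Level {level}: {description}")

# A Level 1-2 system (where this article places Autopilot/FSD) still needs a driver.
print(requires_human_driver(2))  # True
```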

Drivers deemed responsible, based on their driving record and Tesla’s monitoring, would be allowed to use the FSD features. Perhaps Tesla is relying on good driving behavior to address an important issue: preventing misuse of the self-driving features it provides. There have been accidents in the past, some of them fatal, that were attributed to Autopilot use. Whether they were caused by driver negligence or by a failure of the car’s software, one thing is certain: they jeopardized public safety.
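To make the gating idea concrete, here is a hypothetical sketch of granting beta access only after a clean week of driving. Tesla has not published its exact criteria, so the metrics, the zero-tolerance rule, and the 7-day window below are illustrative assumptions, not the company’s actual logic.

```python
from dataclasses import dataclass

@dataclass
class DailyDrivingRecord:
    # Hypothetical per-day driving metrics; not Tesla's real telemetry schema.
    hard_braking_events: int
    forward_collision_warnings: int
    forced_autopilot_disengagements: int

def eligible_for_fsd_beta(last_7_days: list) -> bool:
    """Grant beta access only after 7 consecutive days of clean driving."""
    if len(last_7_days) < 7:
        return False
    return all(
        day.hard_braking_events == 0
        and day.forward_collision_warnings == 0
        and day.forced_autopilot_disengagements == 0
        for day in last_7_days
    )

# Example: one forward-collision warning during the week blocks access.
week = [DailyDrivingRecord(0, 0, 0) for _ in range(6)]
week.append(DailyDrivingRecord(0, 1, 0))
print(eligible_for_fsd_beta(week))  # False
```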

FSD extends Autopilot with even more self-driving features. These include on-ramp and off-ramp navigation, lane changing, the Summon feature, parking assist, and automatic braking (e.g. AEB, or Automatic Emergency Braking). Drivers should understand that these features still require attention, because there is always a chance they can fail. Unfortunately, the accidents to date have been due either to drivers being careless and not following directions or to Autopilot failures. The terms “Autopilot” and “Full Self-Driving” can also mislead car owners, so they should be better educated on what these systems actually do.
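To illustrate one of these features, here is a minimal sketch of how an AEB-style decision can be made from a time-to-collision estimate. The 1.5-second threshold and the single-obstacle model are illustrative assumptions, not Tesla’s actual calibration.

```python
def should_emergency_brake(distance_m: float,
                           closing_speed_mps: float,
                           ttc_threshold_s: float = 1.5) -> bool:
    """
    Trigger braking when time-to-collision (distance / closing speed)
    drops below a threshold. The 1.5 s value is illustrative only.
    """
    if closing_speed_mps <= 0:   # not closing on the obstacle
        return False
    time_to_collision = distance_m / closing_speed_mps
    return time_to_collision < ttc_threshold_s

# Example: obstacle 20 m ahead, closing at 15 m/s -> TTC ≈ 1.33 s -> brake.
print(should_emergency_brake(distance_m=20.0, closing_speed_mps=15.0))  # True
```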

This raises another question: why allow Autopilot or FSD on public roads and highways in the first place? It is a gray area, because determining who is at fault in an accident can become a legal problem. When the accident occurred, was the car in control (Autopilot enabled) or was it the driver (Autopilot off)? Even so, some question whether Tesla is liable because of the way it labels and describes its self-driving system.

The hope is that self-driving cars can make the roads safer and prevent accidents due to human error, helping to bring down incidents of drunk driving and falling asleep behind the wheel. Those accidents should be prevented by the self-driving car’s onboard software. If that is not what is happening, perhaps semi-autonomous features should be limited until there is a more proven safety track record.

The features seem to work fine in many cases (based on the author’s experience with Autopilot on a Model S), but not everyone has the same experience. If you still have to pay attention to the road while using self-driving features, you need quick eye-hand coordination and constant alertness, and perhaps not everyone has that ability. For self-driving cars to qualify as safe and not a threat to public safety, they should be able to operate without human intervention. That is why a car that is truly called “self-driving” should not require a driver at all.

This is why an infrastructure still needs to be built to support fully autonomous self-driving cars. More powerful microprocessors combined with sensor fusion will also help in the long run, since more number crunching is required to achieve accuracy and precision. With those changes in place, the car can begin operating without the need for any driver. It still has to prove its safety on the road, and that is the final test. Only then can we begin calling cars fully self-driving.
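As a small illustration of the sensor-fusion point, here is a sketch that combines two noisy distance estimates (say, from a camera and a radar) using an inverse-variance weighted average, the basic building block behind Kalman-style fusion. The sensor values and variances are made-up numbers for the example.

```python
def fuse_estimates(camera_dist_m: float, camera_var: float,
                   radar_dist_m: float, radar_var: float) -> float:
    """
    Combine two noisy distance measurements by weighting each one by the
    inverse of its variance (a one-step Kalman-style update).
    """
    w_cam = 1.0 / camera_var
    w_rad = 1.0 / radar_var
    return (w_cam * camera_dist_m + w_rad * radar_dist_m) / (w_cam + w_rad)

# Example: radar is more precise at range, so the fused value leans toward it.
fused = fuse_estimates(camera_dist_m=52.0, camera_var=4.0,
                       radar_dist_m=50.0, radar_var=1.0)
print(round(fused, 2))  # 50.4
```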
