
Smart Summon And The Issues With Self-Driving Cars

Vincent T.
Published in 0xMachina
Nov 4, 2019

Tesla’s Software Version 10 delivers the Smart Summon feature. It allows any Tesla model to drive itself, with no one behind the wheel, from its parking space to the car’s owner. It is a cool feature to have and seems fun to use, but a reality check suggests otherwise. Smart Summon is just one of many autonomous driving features developed by automakers. A more established example is self-parking, which lets the driver take their hands off the wheel while the car’s own system steers it into a parking space. So far self-parking has worked without major incident or much concern, though accidents can still happen when other motorists are not aware of what the car is doing.

What Is Smart Summon?

Smart Summon was designed to add a degree of autonomy. It improves upon the basic Summon feature, which originally allowed a Tesla to roll out of its parking space to give the owner more room to enter. That worked well in tight parking spaces, and since the car only had to roll a short distance toward the owner, it was quite safe. Smart Summon takes it to another level by allowing the car to drive itself to the owner, within a limited distance (a maximum of 200 feet). Just think how convenient it would be if, on a rainy day, you could summon your car to drive across the parking lot and pick you up.
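To make that 200-foot limit concrete, here is a minimal sketch in Python of the kind of range check a summon request implies. It is purely illustrative; the function name and coordinate scheme are my own assumptions, not Tesla’s code.

    import math

    # Assumed constant taken from the article: Smart Summon works only
    # within roughly 200 feet of the owner's phone.
    MAX_SUMMON_RANGE_FT = 200

    def within_summon_range(car_pos, owner_pos):
        """Return True if the owner is close enough to summon the car.

        Positions are (x, y) coordinates in feet on a local ground plane;
        in reality they would be derived from GPS fixes on the car and
        the owner's phone.
        """
        dx = car_pos[0] - owner_pos[0]
        dy = car_pos[1] - owner_pos[1]
        return math.hypot(dx, dy) <= MAX_SUMMON_RANGE_FT

    # The owner stands 150 ft from the car, so the summon is allowed.
    print(within_summon_range((0, 0), (90, 120)))  # True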

Many videos posted online show Tesla owners recklessly showing off the feature. People like to play with their toys, and sometimes curiosity gets the best of them. The problem is both how people are using the feature and the fact that Tesla allows them to use it that way. Owners are trying to use the feature as advertised, but it is failing them, and accidents have already been reported. I was impressed when I first saw Tesla’s video demonstrating the feature, though my first thought was how great it would be to show off to friends, which was probably not the intent.

Take, for example, an owner using Smart Summon to call the car from halfway across a busy parking lot. Tesla’s cars are at most Level 2, meaning semi-autonomous. What ends up happening is the car slowly stumbles its way through the busy lot toward the owner. The feature may work when fewer variables are in play, but in a real-world scenario full of people and moving cars it is less likely to succeed, and that is exactly what has been happening. Owners end up intervening at the last minute or just walking to their car. There have been close calls where the Tesla abruptly stops when it detects a nearby moving vehicle. Other motorists, meanwhile, don’t know how to react because there is no driver in the Tesla. It is not natural at all.

This leads to more confusion as other motorists try to figure out what the Tesla is trying to do. Ordinarily, a motorist pulling out can gesture to another driver to either wait or move along. With a driverless car, that is simply not possible. The Tesla also has trouble finding its way to the owner. How can this be? The easy explanation is that the feature is still in its infancy and not ready for mainstream use. It still has plenty of room to improve, but if that is the case, perhaps it should not have been released at all.

Forbes magazine reported:

“Unfortunately for Tesla, the release resulted in several owners uploading videos of the cars making mistakes in smart summon mode, and even in a few cases damaging themselves hitting things like garage doors and some other vehicles.”

According to a tweet from Elon Musk, Tesla owners had already used the feature at least 550,000 times within the first few days of its release with Software Version 10 (October 2, 2019). It’s just like how new features on smartphones or any other device receive plenty of attention at first release. The feature seems to work for most owners who have tried it, but it is still far from stable. Reactions vary from amazement to disbelief when Smart Summon doesn’t quite work out.

Tesla is already gathering data from the Smart Summon experience; even Elon Musk knows how many times it has been used. As people continue to use the feature, Tesla will learn what to improve and, hopefully, how to make it safer for the public. According to Tesla, owners remain responsible when using the feature:

“You are still responsible for your car and must monitor it and its surroundings at all times within your line of sight because it may not detect all obstacles. Be especially careful around quick-moving people, bicycles, and cars.”

How Tesla Self-Driving Works

To better understand how Tesla’s self-driving works, take a look at the diagram below, which illustrates the technology.

Tesla’s self-driving technology illustrated.

Unlike its competitors, Tesla does not use LiDAR (as of this writing). Instead, Tesla’s sensor suite consists of 8 cameras (forward, narrow, side, and rearview) that together cover 360 degrees of visibility. There is also a radar system for detecting objects that are not visible to the cameras, plus 12 ultrasonic sensors that surround the car. The ultrasonic sensors detect much closer objects, for example when parking or keeping a lane on the freeway.
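As a rough mental model of that sensor suite, the sketch below groups detections by modality and fuses them in the simplest possible way, by picking the nearest reported obstacle. All class and field names are illustrative assumptions, not Tesla’s actual software.

    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class Detection:
        distance_m: float   # range to the detected object, in meters
        bearing_deg: float  # direction relative to the car's heading
        source: str         # "camera", "radar", or "ultrasonic"

    @dataclass
    class SensorFrame:
        # Mirrors the suite described above: 8 cameras with 360-degree
        # coverage, a radar for objects the cameras cannot see, and 12
        # near-field ultrasonic sensors around the car.
        camera: List[Detection] = field(default_factory=list)
        radar: List[Detection] = field(default_factory=list)
        ultrasonic: List[Detection] = field(default_factory=list)

        def nearest_obstacle(self) -> Optional[Detection]:
            """Naive fusion: report the single closest detection."""
            all_hits = self.camera + self.radar + self.ultrasonic
            return min(all_hits, key=lambda d: d.distance_m, default=None)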

The software behind Smart Summon uses these cameras and sensors to navigate the car around the parking lot. It is a brilliant idea in theory, but in execution there are still issues. One thing self-driving cars have not been good at handling is road courtesy, along with accurately anticipating what other drivers will do. A Tesla can abruptly slam on the brakes, leaving other motorists confused about its intentions. That can lead to mishaps, since what a self-driving car does next depends entirely on what its sensors see and what its software is programmed to do.
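Continuing the sketch above, the abrupt stop can be pictured as a bare threshold rule: halt whenever anything enters a safety bubble, otherwise creep toward the owner. The thresholds are invented for illustration, and this is nothing like Tesla’s real control stack, but it shows why sensor-triggered braking looks jerky and unreadable to other drivers.

    STOP_RADIUS_M = 3.0    # assumed safety bubble around the car
    CREEP_SPEED_MPS = 1.5  # assumed walking-pace limit while summoning

    def plan_step(frame: SensorFrame) -> float:
        """Return the commanded speed (m/s) for the next control tick."""
        hit = frame.nearest_obstacle()
        if hit is not None and hit.distance_m < STOP_RADIUS_M:
            # Hard stop the moment anything enters the bubble. This binary
            # stop/go behavior is the "abrupt brake" other motorists see,
            # with no gesture or eye contact to explain the car's intent.
            return 0.0
        return CREEP_SPEED_MPS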

Fixes On The Way

Tesla is addressing the issues with new software updates, much as Microsoft ships updates to fix bugs discovered in Windows. Real-time monitoring and feedback are what allow these fixes to be made. Tesla collects user data to help it improve and design its products, so Tesla owners are, in a way, beta testers providing data about how Smart Summon performs. What Tesla found is that the feature does have glitches, which is why it is issuing a fix.

Final Thoughts

With beta software like this, the glitches are understandable, but they can also be dangerous. If someone were to try using the feature on the freeway, that is obviously negligence on their part, but can Tesla be blamed for providing it? I keep coming back to the saying that just because you can do something doesn’t mean you should. With Smart Summon, the temptation is to use it simply because you have it. I would be more worried about the people who use it without knowing how it works. Always read the manual or consult a Tesla rep for support, and remember that if you get into an accident, you are responsible. To be on the safe side, simply not using it keeps everyone out of harm’s way.

Humans have intuition, built from the lessons learned in driving school and from the best teacher of all: experience. Self-driving cars need to learn the same things in order to drive safely with the level of alertness of a human driver. That is something no self-driving car has achieved so far.
