Vision Correction: Identifying the Best Way for an Autonomous Vehicle to “See” the World

Phillip Wilcox
Published in The Startup
Feb 9, 2021 · 9 min read

This week’s article is the first in a four-article series about the technological innovations that allow an autonomous vehicle to drive itself! The material for this article comes from my book, The Future is Autonomous: The U.S. and China Race to Develop the Driverless Car.

One disclaimer: my background is in policy and economics. I am not an engineer. However, I have been very grateful to receive input from two very experienced engineers who work on autonomous vehicles. Therefore, this article, as well as the next three in the series, describes the basic technological components.

This article describes the debate between cameras and LiDAR sensors. Cameras and LiDAR sensors represent the “eyes” of the vehicle: the sensors that provide the data for the automated driving system to perceive the world around it and then decide how to drive.

LiDAR Sensors: Essential for Autonomous Driving or a “Fool’s Errand?”

When Elon Musk talks, people listen. He built Tesla to be one of the largest electric vehicle companies in the world. He also introduced many consumers to semiautonomous driving with Tesla vehicles’ level two autopilot feature. Therefore, when he said “LiDAR is a fool’s errand. Anyone relying on LiDAR is doomed. Doomed!” at Tesla’s recent Autonomy Day, it made headlines.

A less boisterous CEO from a startup company working on autonomous vehicles in China, Dr. Jianxiong Xiao (“Professor X”), shares a similar opinion. He does not necessarily share Musk’s strong anti-LiDAR sentiment, but he does argue that other companies in the autonomous vehicle industry underestimate the potential of a camera-focused autonomous driving system, and he claims LiDAR sensors are not necessary for vehicles to drive fully autonomously.

LiDAR stands for Light Detection and Ranging. The bulky, honeycomb-shaped object attached to the roof of a test vehicle is a LiDAR sensor. Together with cameras, LiDAR sensors serve as the “eyes” of the autonomous vehicle.

Most companies working on autonomous vehicles in both the US and China disagree, including Waymo, Ford, GM Cruise, Uber, Baidu, and Didi. They believe LiDAR is an essential part of the sensor stack (LiDAR sensors, cameras, GPS, radar, etcetera) a vehicle needs to drive autonomously. They argue LiDAR data is just as crucial as camera imaging data: for autonomous vehicles to accurately perceive the world around them, they need as many different sensors and cameras as possible. Why do Musk and Professor X believe LiDAR is unnecessary? Could a vehicle really reach level four or five autonomy (fully autonomous) without it?

LiDAR works much like radar, but instead of sending out radio waves it sends out pulses of invisible infrared laser light. The sensor measures how long each pulse takes to reflect back after hitting an object. It sweeps its surroundings roughly ten times per second and compiles the results into a point “cloud” that creates a 3D map of the world around the vehicle in real time. The resulting map can identify not only how far away objects are from the car but also what the objects are (another car, a person, a tree, etcetera). The vehicle’s computer “brain” can then predict each object’s behavior and decide how the vehicle should drive to avoid it.
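The core ranging idea is simple time-of-flight arithmetic: light travels at a known speed, so half the round-trip time of a pulse gives the distance to whatever it bounced off. Here is a minimal, hypothetical sketch of that calculation (the function name and example timing are mine, not from any vendor’s API):

```python
# Hypothetical illustration of LiDAR time-of-flight ranging:
# distance = (speed of light x round-trip time) / 2,
# halved because the pulse travels out to the object and back.
SPEED_OF_LIGHT_M_S = 299_792_458  # meters per second

def distance_from_echo(round_trip_seconds: float) -> float:
    """Return the distance (in meters) to the object that reflected the pulse."""
    return SPEED_OF_LIGHT_M_S * round_trip_seconds / 2

# A pulse that returns after about 66.7 nanoseconds hit something ~10 m away.
print(round(distance_from_echo(66.7e-9), 1))  # ~10.0
```

The tiny time scales involved (tens of nanoseconds for nearby objects) are why LiDAR units need very precise timing electronics.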

The technological jargon of how LiDAR sensors create the 3D point cloud is confusing to the layman. However, LiDAR sensors have recently been installed on popular household cleaning appliances, and what they do is easier to grasp than the technology behind how they do it. I have a friend who is a busy attorney and doesn’t have time to clean his beautiful apartment. In addition to dust, papers and clothes are often on the floor.

One day he decided to buy a vacuum that would run by itself. It sounded like a great idea, and he was very excited, but I was skeptical it would work. I said, “I’ve heard these vacuums constantly get stuck and only clean a small section of your floor.” He replied, “No, this one is different. It’s more expensive, but it has this LiDAR system that makes it follow a specific route and prevents it from getting stuck.”

No way, I thought. I had heard of LiDAR sensors before in my research for this book. I took the ad he was holding and glanced at the vacuum he was telling me about. He was right. While more primitive than a vehicle, the technology operates in the same way. The vacuum sends out infrared light “lasers” to map its path. It then travels along this path, avoiding any obstacles along the way. The signals it receives also ensure it does not collide with anything or get stuck in a corner.

LiDAR gives autonomous vehicles “superpowers”: continuous, 360-degree visibility of the world around them. Because an automotive LiDAR system spins at over six hundred rpm and emits millions of light pulses per second, the car literally has eyes in the back of its head. LiDAR also lets a vehicle know the precise distance of objects from the car at all times, to an accuracy of about two centimeters. Radar is then used to determine how fast these objects are moving.
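That spinning motion is what turns individual range readings into a 360-degree picture. Each return is essentially a polar coordinate: the azimuth angle the sensor was facing plus the measured distance. Converting those readings to Cartesian coordinates builds up one horizontal slice of the point cloud. A toy sketch, assuming a made-up scan of four readings per revolution (real sensors produce thousands):

```python
import math

# Hypothetical sketch: convert a spinning LiDAR's polar returns
# (azimuth angle in degrees, range in meters) into 2D (x, y) points
# around the vehicle, forming one slice of the point cloud.
def polar_to_xy(angle_deg: float, range_m: float) -> tuple[float, float]:
    theta = math.radians(angle_deg)
    return (range_m * math.cos(theta), range_m * math.sin(theta))

# One (very coarse) revolution: an object 5 m ahead and behind,
# and walls 2.5 m away on either side.
scan = [(0.0, 5.0), (90.0, 2.5), (180.0, 5.0), (270.0, 2.5)]
cloud = [polar_to_xy(angle, rng) for angle, rng in scan]
```

A real automotive unit repeats this for dozens of stacked laser channels at once, which is what produces the full 3D map rather than a single flat slice.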

The Debate Begins. Are LiDAR Sensors Necessary?

Aeva, a company that supplies LiDAR sensors to automakers, has a partnership with Audi, which is also trying to develop autonomous vehicles. Aeva’s co-founder, Soroush Salehian, believes LiDAR sensors are essential for autonomous vehicles to drive safely. He says, “We believe that for a safe stack you must have all the outputs to make the right decision.” He adds, “If you’re using just a camera sensor, there are so many issues that may blind the system. You need to be able to complement the different sensor technology weaknesses. Get both outputs and it’s up to the carmaker to decide what to do with that information.”

What is evident in his statement is that LiDAR is not the only sensor; he also mentions radar and cameras. In his view, however, LiDAR is the most important part of the “safe stack.”

Why do Tesla’s Elon Musk and AutoX’s Professor X criticize this seemingly miraculous technology? In short, what do they know that the rest of the industry does not? For one thing, LiDAR is a relatively new technology, at least for vehicles. LiDAR was first developed in the 1960s by NASA. It was used in 1971 on the Apollo 15 mission, when astronauts used it to map the surface of the moon. As the source of some of the first glimpses the public had of the lunar surface, LiDAR’s potential seemed limitless.

Before LiDAR sensors were even considered for autonomous vehicles, they provided very useful data for archaeological ventures and agriculture because of their ability to accurately map large-scale plots of land. It wasn’t until the 2005 DARPA Grand Challenge, in which companies and organizations raced fully autonomous vehicles, that the winning vehicle, Stanley, put LiDAR sensors to use on an automobile. The leader who developed Stanley went on to lead Google’s self-driving car program, details of which will be in my future article on John Krafcik and Waymo.

This article analyzes the debate between a camera-focused autonomous driving system and a system that includes LiDAR sensors. There is no one perfect answer to the problem of how an autonomous vehicle perceives the world around it. The solution will be to combine the strengths of the different sensors and minimize their weaknesses by merging their data in a process known as “sensor fusion.”

Both Cameras and LiDAR Sensors Have Limitations

Professor X has several complaints about LiDAR sensors, one of which relates to durability. When asked about it, he said, “LiDAR doesn’t cope well in extreme conditions. Hot and cold temperatures can throw off the sensor calibration, which could disrupt the data produced by the sensors.”

Professor X also points out that it would be very difficult for an automotive-grade hardware system with LiDAR to last more than four or five years, assuming it was used as part of an autonomous “robotaxi” fleet running almost 24/7. This is because of the LiDAR sensors’ huge energy demand, which leads to wear and tear on the vehicle’s hardware.

I spoke with Autonomous Vehicle Technical Manager Brian Jee about the issues related to an autonomous vehicle visualizing the world around it. He stated that no vehicle will be one hundred percent safe, even if it has all of the cameras and sensors in the “safety stack.” Weather is one such problem. He laid out the issue by asking, “What happens if it’s raining? We’re talking about all this stuff only on sunny, clear days and as soon as there’s inclement weather it’s all out the window. We don’t have a solution for that. It’s all band-aids.”

Rain or fog can decrease the visibility of cameras, making the car roughly five or ten percent less safe. If a bug hits a camera and blocks its view, the vehicle becomes some percentage less safe again. This is why companies like Waymo, Tesla, AutoX, and Baidu test their autonomous vehicles in places with different climates: they want to expose their vehicles to different weather conditions and see how they react.

Professor X is not the only one concerned about LiDAR sensors’ durability. In an interview with The Telegraph in London, John Rich, the operations chief at Ford Autonomous Vehicles, stated, “We will exhaust and crush a car every four years in this business.” This statement comes with some caveats. The Ford autonomous vehicle will likely be a hybrid rather than an electric car, and hybrid cars are more complex, with more wear and tear even under normal driving conditions.

Ford and other autonomous carmakers will be operating a shared “robotaxi” service, at least at first. Autonomous vehicle companies see this as an initial way to make money, because the vehicles would be too expensive to be commercially viable for individual consumers. The engines will be running almost 24/7 to continuously pick up and drop off new customers. Even with these caveats, it is still noteworthy that one of the largest automakers in the world is concerned about the longevity of LiDAR sensors.

Image resolution can also be an issue with LiDAR sensors. According to Professor X, “The sensors have a lower resolution than even the cheapest cameras — sixty-four pixels vertically, compared to a VGA camera which has a vertical resolution of four hundred eighty pixels.” The classic example is a plastic bag floating in the air: because of the poor image resolution, the vehicle could mistake the bag for a tire flying toward it and swerve to avoid it, possibly causing an accident.

As Mr. Jee described, no clear options currently exist for inclement weather like rain, sleet, or snow, though autonomous vehicle companies have been working on solutions. Each individual camera or sensor has its own specific problems and limitations. When combined, however, they offer a potential answer: multiple cameras and sensors can work together through a process known as “sensor fusion.”
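To make the idea of sensor fusion concrete, here is a minimal, hypothetical sketch of one of its simplest forms: combining a LiDAR distance estimate with a camera-derived one, trusting each in proportion to how certain it is (inverse-variance weighting). The function name, numbers, and variances below are all illustrative; real automotive fusion systems use far more sophisticated methods such as Kalman filtering.

```python
# Hypothetical sketch of inverse-variance sensor fusion: two sensors
# estimate the same distance, and each estimate is weighted by the
# reciprocal of its variance, so the more certain sensor dominates.
def fuse(est_a: float, var_a: float, est_b: float, var_b: float) -> float:
    """Return the variance-weighted combination of two distance estimates."""
    w_a, w_b = 1.0 / var_a, 1.0 / var_b
    return (w_a * est_a + w_b * est_b) / (w_a + w_b)

# LiDAR says 20.0 m with low variance; the camera says 22.0 m with
# high variance. The fused estimate lands near the LiDAR reading.
fused = fuse(20.0, 0.04, 22.0, 1.0)  # ~20.08 m
```

The appeal of fusion is exactly what Mr. Jee’s weather complaint suggests: when rain degrades the camera, its variance rises and its influence on the fused estimate shrinks, while the other sensors carry more of the weight.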

Learn more about the different sensors that serve as the “eyes” of autonomous vehicles in my next article, which will also discuss “sensor fusion,” the complicated technological method of combining multiple sensors so the vehicle’s automated driving system can more accurately perceive the world around it.

Please join me Friday, February 12, from 5:30–7pm EST for the launch party for my book The Future is Autonomous: The U.S. and China Race to Develop the Driverless Car. There will be a Q&A about the book’s content and the entire writing journey, prize giveaways, and a general festive party atmosphere! It is also Lunar New Year, so Happy New Year everyone!

Here is the event information from Eventbrite: