Sensors in Autonomous Vehicles

Rishi Mehta
Published in The Startup
6 min read · Sep 12, 2019


We have all heard of sensors. They might be attached to your robot or favorite toy. The fact is that they are everywhere. But who would have imagined that the sensors you played with as a kid would evolve so much that they are now the key behind autonomous vehicles?

When we say sensor, we mean any device that can measure a certain property, such as pressure, position, or acceleration. The reason we use sensors is that humans are innovating at an incredible rate.

Our technology is getting more and more complicated, and so are the problems we face. Sensors are one of the many things that not only make life convenient but also show how advanced our society is.

However, you might ask yourself: how do these tiny sensors from my robotics camp make our daily lives easier? Surely they can't be the reason autonomous vehicles can drive without a person operating them. Well, the answer depends on how you look at it.

Diagram of what a typical self-driving car would look like

From good to great

If you have ever wondered how autonomous vehicles know where they are and when to stop while moving, the answer is sensors.

Throughout history, we have seen ordinary concepts become some of the most useful innovations of their time. Sensors are no exception: they went from being used in toy cars to being used in real cars.

However, these tiny sensors on their own are not what make autonomous vehicles capable of what they do. To figure out what goes into that, we need to look at specific types of sensors.

Sensors in Self Driving Cars

In a typical autonomous vehicle, three main types of sensors are used to guide the car: cameras, radar, and lidar.

While there might be other sensors on a car, such as a coolant temperature sensor or an intake air temperature sensor, they are not directly related to autonomous driving.

The Camera

Whether it is photos or videos, cameras are the most reliable way for a computer or machine to see something. Most regular cars already have cameras on the sides and back.

The reason is much the same as in self-driving cars: to provide a visual representation of the areas where the driver can't see.

The difference is that an autonomous car can't make sense of anything it sees without being programmed to recognize it.

That is why there are cameras located at the front, rear, left, and right, giving the car a nearly 360° view of its surroundings.

Sometimes, cars have special types of cameras to help with certain tasks, such as a fish-eye camera. These cameras contain a special type of lens that provides a panoramic view.

This gives a wide view of what's behind the vehicle so that it can park itself.

The Radar

As most of us know, radar technology has been around for quite some time. One of the earliest instances of radar dates back to 1904, when German inventor Christian Huelsmeyer patented his 'telemobiloscope', which could detect ships up to 3,000 m away.

Fast-forward 100 years, and radars are now used to calculate the velocity, angle, and other important properties of objects in the air, in water, or on land. However, their application in autonomous cars is by far one of the most interesting.

The first question you might ask is: why use radar at all if cameras are so much more versatile?

The answer is that radar sensors can supplement camera vision in times of low visibility, such as night driving or when the camera is obstructed.

Radars transmit radio waves in pulses. Once these waves hit an object they bounce back to the sensor, giving information about the velocity and location of the object.
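To make the idea concrete, here is a minimal sketch of how a pulsed radar might turn an echo into range and speed. The delay and Doppler-shift numbers are made up for illustration, and a real automotive radar pipeline is far more involved:

```python
# Toy sketch: how a pulsed radar echo yields range and radial speed.
# All measurement values below are fabricated for illustration.

C = 299_792_458.0  # speed of light in m/s

def radar_range(echo_delay_s: float) -> float:
    """Range = (speed of light * round-trip delay) / 2.
    Divide by 2 because the pulse travels out and back."""
    return C * echo_delay_s / 2

def radial_velocity(doppler_shift_hz: float, carrier_hz: float) -> float:
    """Radial speed from the Doppler shift of the returned wave:
    v = (shift * c) / (2 * carrier frequency)."""
    return doppler_shift_hz * C / (2 * carrier_hz)

# An echo arriving 400 ns after transmission -> an object roughly 60 m away.
print(radar_range(400e-9))          # ~59.96 m
# A 77 GHz automotive radar seeing a +5 kHz shift -> closing at ~9.7 m/s.
print(radial_velocity(5e3, 77e9))
```

The factor of 2 in both formulas is the key detail: the wave makes a round trip, so the raw delay corresponds to twice the distance.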

Similar to cameras, radar sensors surround the car so that it can see at every angle. However, radar can't distinguish between different types of objects; that's where cameras come in.

The LiDAR

You might think that if you have cameras for object recognition and radars for object detection, then that should be all we need to build a fully functional autonomous vehicle.

With these two systems, you can build a level one or maybe level two autonomous vehicle. Unfortunately, they don’t cover all the aspects of driving that a human would encounter.

This is where the lidar sensor comes in. Lidar is an acronym for Light Detection and Ranging. If we ever want to achieve a fully driverless car (level 5 automation), we will have to get used to using lidar sensors.

These unique sensors make it possible to build a 3D view of the area around the car. Like radar, lidar works just as well in low-light conditions or when the cameras are covered.

However, unlike radar, it gives shape and depth to surrounding cars and pedestrians, as well as the road geography. This is one of the major reasons lidar sensors should be used alongside radar, as opposed to radar alone.

A lidar system calculates how long it takes for the light to hit an object and reflect back to the scanner. The distance is then calculated using the speed of light. These are known as ‘Time of Flight’ measurements.

Lidar systems can fire around 1,000,000 laser pulses per second. Each of these measurements, or returns, can then be processed into a 3D visualization known as a 'point cloud'. This is how a lidar sensor represents its surroundings.
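A rough sketch of both ideas together: each return's 'time of flight' gives a distance, and combining that distance with the known beam angles gives one 3D point in the cloud. The delays and angles below are fabricated, and real lidar drivers handle calibration, motion compensation, and far more returns:

```python
# Toy sketch: turn lidar 'time of flight' returns into a small point cloud.
# The delays and beam angles are made up for illustration.
import math

C = 299_792_458.0  # speed of light in m/s

def tof_to_point(delay_s, azimuth_deg, elevation_deg):
    """One return -> one (x, y, z) point, using distance = c * delay / 2."""
    r = C * delay_s / 2  # halve the round trip: light travels out and back
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    # Spherical -> Cartesian conversion relative to the sensor.
    x = r * math.cos(el) * math.cos(az)
    y = r * math.cos(el) * math.sin(az)
    z = r * math.sin(el)
    return (x, y, z)

# A handful of fabricated returns; a real sensor produces ~1,000,000 per second.
returns = [(100e-9, 0.0, 0.0), (120e-9, 15.0, -2.0), (90e-9, -30.0, 1.0)]
point_cloud = [tof_to_point(*ret) for ret in returns]
for point in point_cloud:
    print(point)
```

A delay of 100 ns, for instance, maps to a point about 15 m straight ahead of the sensor; stacking up millions of such points per second is what produces the dense 3D picture described above.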

Putting it all together

As we have seen, lidar, radar, and cameras give the car the crucial information it needs in order to make decisions. Unfortunately, it is not as simple as putting sensors on a car and having it magically make decisions based on the parameters it is given.

Much like the human brain processes the visuals taken in by the eyes, an autonomous vehicle must be able to make sense of what its sensors, the eyes in this analogy, perceive.

Self-driving cars do this through a process called sensor fusion. As the name suggests, it joins all the information from the sensors together and makes decisions from there. The inputs from the sensors are fed into a high-performance, centralized AI computer, such as the NVIDIA DRIVE AGX platform.

NVIDIA DRIVE AGX platform

The best part about sensor fusion is that there are several inputs to rely on, rather than just one main one. This way if something fails and one input does not work, there are others to fall back on, further ensuring safety.

It also means that when making decisions, the car can weigh several parameters to further ensure that its decision is the right one.

For example, if a car needs to slow down for a crossing pedestrian, and both the radar and the camera detect the person, the car is much safer because its decision is backed by several inputs.
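The redundancy idea can be sketched in a few lines. This is a toy illustration, not how a production fusion stack works: the sensor names and confidence numbers are hypothetical, and real systems fuse continuous tracks, not booleans:

```python
# Toy sketch of the redundancy behind sensor fusion: act when independent
# sensors agree, so no single faulty input decides alone.
# Sensor names and confidence values below are hypothetical.

def should_brake(detections: dict) -> bool:
    """Brake if at least two independent sensors report a pedestrian
    with reasonable confidence (threshold chosen arbitrarily here)."""
    agreeing = [s for s, conf in detections.items() if conf >= 0.5]
    return len(agreeing) >= 2

# Camera and radar both see the pedestrian; the lidar's view is blocked.
print(should_brake({"camera": 0.9, "radar": 0.8, "lidar": 0.1}))  # True
# Only the camera fires, perhaps a false positive, so agreement is missing.
print(should_brake({"camera": 0.9, "radar": 0.2, "lidar": 0.1}))  # False
```

The point of the sketch is the cross-check: a decision backed by two independent inputs is far more trustworthy than one resting on a single sensor.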

The End?

Is that it? Is being used in autonomous vehicles all sensors were meant to do? Honestly, it doesn't matter. Sensors are some of the most intuitive and impressive concepts we have seen in our history!

When we look back in 20 years, we will truly understand how important sensors were to the field of autonomous vehicles. But we must also understand why sensors are so important to us and how they might be applied to different fields.

There are so many applications of sensors in the future that we probably have never thought of. It is incredible to see the number of possibilities that are open as soon as we bring sensors into the equation.

This is a world where people are working hard every day to do something new. I like to think of the space of possibilities as a table with 50 different ingredients.

So many of the combinations have already been tried, or taste terrible. However, there might be a few that are completely new and taste amazing.


Rishi Mehta

17 y/o working on building a fall detection system for seniors | fallyx.com