Autonomous Vehicles: The ins and the outs

Autonomous vehicles have been in the news and the tech community for quite some time now. Made a household name by Tesla, Inc., they are now the focus of dedicated divisions at every major technology company. But what are they, exactly? And how do they actually work?

What are autonomous vehicles?

Autonomous vehicles are cars or trucks in which human drivers are never required to take control to safely operate the vehicle. Also known as robot or “driver-less” cars, they combine sensors and software to control, navigate, and drive the vehicle.

Contrary to popular belief, automation is not all-or-nothing: the SAE J3016 standard defines six levels, from Level 0 (no automation) through Level 5. The five automated levels are:

  1. Driver Assistance: In this, the only automated task is either steering or acceleration/deceleration, with the expectation that the human driver will perform all other tasks.
  2. Partial Automation: Both steering and acceleration/deceleration are controlled by computers, with the expectation that the human driver will perform all other tasks.
  3. Conditional Automation: The automated driving system takes care of all aspects of driving with the expectation that the human driver will respond appropriately to a request to intervene.
  4. High Automation: The automated driving system takes care of all aspects of driving even if a human does not respond to a request to intervene.
  5. Full Automation: The automated driving system can drive in all conditions and environments and does not need a human at all.

While most companies working in this space have reached the second or third level of automation (with some advanced prototypes reaching the fourth), there are currently no legally operating, fully autonomous vehicles anywhere in the world.

Despite the marketing, all current Tesla models fall under Level 2: the driver is expected to supervise the system at all times.

How do autonomous vehicles work?

A self-driving vehicle is capable of sensing its environment and navigating without human input. To accomplish this task, each vehicle is usually outfitted with a GPS unit, an inertial navigation system, and a range of sensors including laser rangefinders, radar, and video.

The vehicle uses positional information from the GPS and inertial navigation system to localise itself, and sensor data to refine its position estimate as well as to build a three-dimensional picture of its environment.
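This fusion of GPS fixes with inertial dead reckoning is typically done with a Kalman-style filter, which weights each source by how noisy it is. Here is a minimal one-dimensional sketch; the positions and variances are illustrative assumptions, not real sensor specifications:

```python
# Minimal 1-D Kalman-style update: fuse a dead-reckoned position from the
# inertial system with a fresh (noisy) GPS fix, weighting by variance.
def kalman_update(est, est_var, measurement, meas_var):
    """Blend a prior estimate with a new measurement; trust the less-noisy source more."""
    gain = est_var / (est_var + meas_var)   # how much to trust the measurement
    new_est = est + gain * (measurement - est)
    new_var = (1.0 - gain) * est_var        # fused estimate is less uncertain
    return new_est, new_var

# Dead reckoning says 105.0 m along the road (variance 4.0 m^2);
# a GPS fix says 100.0 m (variance 1.0 m^2).
est, var = kalman_update(105.0, 4.0, 100.0, 1.0)
# The fused estimate lands closer to the GPS fix, since GPS is less noisy here.
```

Real systems run this update in multiple dimensions and at high frequency, but the principle is the same: each new measurement nudges the estimate in proportion to its reliability.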

The radar used by the computer systems can see up to 160 meters ahead of the car, through sand, snow or fog. Radar is the primary sensor used to detect the vehicle’s surroundings, along with the front-facing cameras.

A 360-degree, ultrasonic sonar detects obstacles in an eight-meter radius around the car. The ultrasonic sensors can spot objects like a child or a dog, and work at any speed. This feature can also detect objects in blind spots and assist the car when automatically switching lanes.
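Conceptually, the blind-spot check reduces to asking whether any detected object lies within the sonar's radius. A toy sketch (the obstacle coordinates are hypothetical):

```python
import math

# Toy blind-spot check: flag obstacles within the ~8 m ultrasonic range.
# Positions are (x, y) offsets in meters from the car's center.
SONAR_RANGE_M = 8.0

def obstacles_in_range(obstacles):
    """Return the detections that fall inside the sonar radius."""
    return [p for p in obstacles if math.hypot(p[0], p[1]) <= SONAR_RANGE_M]

nearby = obstacles_in_range([(2.0, 1.5), (10.0, 0.0), (-3.0, -4.0)])
# (2.0, 1.5) and (-3.0, -4.0) are inside the 8 m radius; (10.0, 0.0) is not.
```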

The four forward-facing cameras serve as a backup to the radar:

  1. a narrow camera that captures footage up to 250 meters ahead,
  2. a main camera that covers 150 meters ahead,
  3. a wide-angle camera that covers 60 meters ahead, and
  4. a camera that covers 80 meters ahead and to the side of the car.

The wide-angle camera is designed to read road signs and traffic lights, allowing the car to react accordingly. A pair of rear cameras captures footage up to 100 meters to the rear and the rear sides of the car.

Once all this sensor data has been gathered, the on-board computers build a constantly updating map of the environment around the car. This map contains buildings, sidewalks, pedestrians, cyclists, and other cars.
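One common representation for such a map is an occupancy grid: the area around the car is discretized into cells, and cells with sensor returns are marked occupied. Real systems fuse many returns probabilistically; this sketch (with an assumed 0.5 m resolution and made-up detections) just sets binary flags:

```python
# Toy occupancy grid: discretize the area around the car into cells and
# mark cells that sensors report as occupied.
CELL_SIZE_M = 0.5   # each grid cell covers 0.5 m x 0.5 m (assumed resolution)
GRID_HALF_M = 20.0  # map a 40 m x 40 m square centered on the car

def to_cell(x, y):
    """Convert a position in meters (car at the origin) to grid indices."""
    return (int((x + GRID_HALF_M) / CELL_SIZE_M),
            int((y + GRID_HALF_M) / CELL_SIZE_M))

occupied = set()
for detection in [(3.2, -1.0), (3.4, -1.1), (-6.0, 8.5)]:  # sensor hits
    occupied.add(to_cell(*detection))

def is_free(x, y):
    """True if no sensor return has landed in this position's cell."""
    return to_cell(x, y) not in occupied
```

The path planner can then query `is_free` for any point it considers driving through, and the grid is rebuilt or updated as new sensor data arrives.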

Obstacles are categorised depending on how well they match up with a library of predetermined shape and motion descriptors. The vehicle uses a probabilistic model to predict the future path of each moving object based on its shape and prior trajectory. The previous, current, and predicted future locations of all obstacles in the vehicle's vicinity are incorporated into its internal map, which the vehicle then uses to plan its path.
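The simplest instance of such trajectory prediction is a constant-velocity extrapolation from recent observations. Real trackers use probabilistic models (e.g. Kalman filters) over richer motion classes, but this toy sketch conveys the idea:

```python
# Toy constant-velocity prediction: extrapolate an obstacle's future
# positions from its two most recent observed positions.
def predict_path(prev, curr, dt, horizon_s):
    """Return predicted (x, y) points every dt seconds up to horizon_s."""
    vx = (curr[0] - prev[0]) / dt   # estimated velocity from the last two fixes
    vy = (curr[1] - prev[1]) / dt
    steps = int(horizon_s / dt)
    return [(curr[0] + vx * dt * k, curr[1] + vy * dt * k)
            for k in range(1, steps + 1)]

# A cyclist observed at (0, 0) and then at (1, 0.5) one second later:
future = predict_path((0.0, 0.0), (1.0, 0.5), dt=1.0, horizon_s=3.0)
# Each predicted point continues the cyclist's motion in a straight line.
```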

The goal of path planning is to use the information captured in the vehicle’s map to safely direct the vehicle to its destination while avoiding obstacles and following the rules of the road. Though each company has a different algorithm to plan a path, all of them have some things in common.

The algorithm determines a rough long-range plan for the vehicle to follow while continuously refining a short-range plan (changing lanes, driving forward). It starts from a set of short-range paths that the vehicle would be capable of completing given its speed, direction, and position, and removes all those paths that would either cross an obstacle or come too close to the predicted path of a moving one. Once the best path has been identified, a set of throttle, brake, and steering commands is passed on to the vehicle's on-board processors, which in turn pass the commands along to the respective motors and actuators.
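The generate-filter-select loop described above can be sketched in a few lines. This is not any company's actual planner; the candidate headings, lookahead distance, and safety margin are all illustrative assumptions:

```python
import math

# Toy path selection: generate candidate short-range headings, discard any
# whose endpoint passes too close to a predicted obstacle position, then
# pick the surviving candidate closest to the desired heading.
SAFE_MARGIN_M = 2.0
LOOKAHEAD_M = 10.0

def endpoint(heading_deg):
    """Where a short-range path at this heading would end up."""
    r = math.radians(heading_deg)
    return (LOOKAHEAD_M * math.cos(r), LOOKAHEAD_M * math.sin(r))

def pick_heading(candidates_deg, obstacles, goal_deg):
    safe = []
    for h in candidates_deg:
        ex, ey = endpoint(h)
        # keep the path only if its endpoint clears every predicted obstacle
        if all(math.hypot(ex - ox, ey - oy) > SAFE_MARGIN_M
               for ox, oy in obstacles):
            safe.append(h)
    # among safe paths, choose the one that deviates least from the goal
    return min(safe, key=lambda h: abs(h - goal_deg)) if safe else None

# An obstacle is predicted dead ahead at 10 m; the goal is to drive straight (0 deg).
best = pick_heading([-20, -10, 0, 10, 20], [(10.0, 0.0)], goal_deg=0)
# The straight-ahead and near-straight candidates are rejected as unsafe,
# so the planner swerves to one of the wider headings.
```

A real planner evaluates full curved trajectories against many moving obstacles over time, and re-runs this selection many times per second, but the filter-then-rank structure is the same.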

The road ahead

Car manufacturers have made significant advances in the past decade towards making self-driving cars a reality; however, a number of barriers remain before self-driving vehicles are safe enough for common road use. Outside of countries with well-structured road infrastructure, traffic is often too chaotic and unpredictable for autonomous cars. Computer vision systems still struggle to fully understand road scenes, and have yet to match human drivers' quick decision-making and ability to navigate unstructured environments.

In spite of all this, the future looks bright. As Yandex (a Russian technology company) showcased at CES 2019, autonomous taxis may soon become a reality.

The amount of road and traffic data available to these vehicles is increasing, newer range sensors are capturing more data, and the algorithms for interpreting road scenes are evolving. As technology improves, more driving tasks can be reliably outsourced to the vehicle.

Moreover, as more driver-less cars take to the streets, they may start communicating with each other over temporary vehicle-to-vehicle networks, making the roadway a much safer place than it is now.

Who knows, in the near future there may be a completely automated driving network like we have seen in sci-fi flicks like Minority Report or I, Robot.

A scene from Minority Report