An Autonomous Future

Where the roads are human error-free

Esha Saleem
9 min read · Jan 6, 2020

Self-driving cars are the natural extension of active safety and obviously something we should do.

- Elon Musk

Human Error is everywhere. People are just trying to get from point A to point B. The anomalies in between are commonly out of our control; you’d have to be perfect to avoid them, and we are human after all.

Nearly 1.25 million people are killed on the world’s roads every year. That’s 3,287 deaths a day. That’s an average of 3,287 too many. And that’s not even counting the millions of injured individuals.

It is almost 2020, and time to trust machines, to trust artificial intelligence; it is time to accept that human intelligence isn’t enough to avoid those anomalies. AI in cars can be programmed to avoid anything fatal and to ensure we can all feel safe.

The beginning of the Vehicle

Karl Benz’s first engine ran for the very first time on New Year’s Eve, making 1879 a revolutionary year for transportation. After decades of evolution, the car found itself a hood and space for more passengers.

Benz Patent Motor Car: The first automobile (1885–1886)

By 2016, BMW achieved its goal of partial automation, where the driver may disengage from a few tasks whilst the vehicle may assist with steering and acceleration.

An Autonomous Vehicle

Autonomous → having the freedom to act independently

Vehicle → a thing used for transporting people or goods

So an autonomous vehicle is a form of transportation with a mind of its own. But how much control should a car have? And how much can it have with today’s technology? There are multiple levels of control AI can have over a car; these are its levels of automation.

Levels of Automation:

0) No Automation

Let’s imagine that the car is a newborn baby, while the parent is the driver. In this case, the parent performs all the baby’s tasks for them.

Looking at an actual car, the driver performs all tasks, such as steering, braking, accelerating, etc. This level is currently very common and is found in most models.

1) Drive Assistance

When the baby grows a bit older, the parent would still be required to care for them (potty training and feeding) though the baby may crawl around on its own.

All BMW models have a driver’s assistant, which simply supports the driver but does not take control.

2) Partial Automation

Now that the child can walk, and is potty trained, the parents can start to relax a bit. They would still have to care for the baby, but they now trust that the little guy can move around safely.

This level of automation was achieved by BMW in 2016; the driver’s assistant can take control but the driver is still responsible for the vehicle and must pay attention.

3) Conditional Automation

Conditional automation is the biggest leap for autonomous vehicles: cars go from level 2, where the human monitors the driving, to level 3, where the driving system itself does the monitoring.

Level 3, more specifically, allows the driver to “disengage” from some functions such as braking, but the driver must remain attentive enough to intervene whenever necessary. BMW aims to achieve this tough jump in automation by 2021.

4) High Automation

The vehicle is basically independent. It can steer, accelerate, monitor its surroundings, and brake when necessary. It is even supposed to be capable of turning, switching lanes, and responding to situations whenever needed, though merging onto the highway amidst traffic can get pretty tough. Once level 3 of automation is achieved, BMW hopes to reach level 4 within the same year.

5) Full Automation

The dream: a vehicle capable of controlling itself entirely on its own. This level of autonomous driving needs no human attention at all. Instead, you can productively do other things while your trusty AV gets you to your destination. This dream of a future is close, and for BMW, only four years away (2023)!

BMW Vision iNext

Today, you literally have a choice to either “boost” or be at “ease” with BMW’s highly automated electric car. The BMW Vision iNext was designed to create an “intelligent step towards the future.”

The car has two simple modes; you would either choose Boost mode to drive it yourself or Ease mode for autonomy. Depending on the situation or your mood, you can decide if you are willing to place your trust in the future of AI.

The Concept Car — BMW Vision iNext

How an Autonomous Vehicle Works

A human driver relies on their senses behind the wheel. Similarly, if a car had a mind of its own, it would need its own “senses.” Those are the five components that give a car its ability to move autonomously and provide it with a safe form of control.


Under the category of perception, computer vision and sensor fusion pair up to make an unstoppable team, achieving a “perception” of the environment around the vehicle.

Computer Vision

When shopping, you look through clothes, judging them by appearance according to what your eyes see. A camera performs a similar task for an AV: camera images are used to figure out what the world around the vehicle looks like.

With a camera looking around the car, the system should be able to recognize what the objects are, where they are, and where the lanes run. Image classification is a huge breakthrough: imagine an AI looking at an image of a dog and being able to “classify” it as a dog. Put that on the road, and the machine uses training data to learn what a human, a car, and so on look like, and then uses that knowledge to classify roads, signs, lights, cars, people and more.

Detecting cars with the use of perception. (Mohammad Atif Khan)

A CNN (Convolutional Neural Network) can be trained to recognize objects. It performs convolution operations on images in order to classify them. An image captured by the camera is broken down into pixels (often converted to grayscale) and passed through the CNN’s layers of learned filters. The higher the quality of the image, the more accurate the output.
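To make the convolution step concrete, here is a tiny from-scratch sketch in pure Python. The kernel values and the 4×4 “image” are made up for illustration; a real CNN learns its kernels from training data and stacks many such layers.

```python
def convolve2d(image, kernel):
    """Slide a kernel over a 2D image (no padding, stride 1)."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    output = []
    for i in range(out_h):
        row = []
        for j in range(out_w):
            # Multiply the overlapping pixels by the kernel weights and sum.
            total = sum(
                image[i + di][j + dj] * kernel[di][dj]
                for di in range(kh)
                for dj in range(kw)
            )
            row.append(total)
        output.append(row)
    return output

# A 4x4 grayscale patch with a vertical edge down the middle...
patch = [
    [0, 0, 9, 9],
    [0, 0, 9, 9],
    [0, 0, 9, 9],
    [0, 0, 9, 9],
]
# ...and a vertical-edge-detecting kernel (Sobel-like, hand-picked here).
edge_kernel = [
    [-1, 0, 1],
    [-1, 0, 1],
    [-1, 0, 1],
]

feature_map = convolve2d(patch, edge_kernel)
# Every cell of the feature map responds strongly to the edge.
```

Stacking many learned kernels like this one, each tuned to a different pattern, is how a CNN builds up from edges to wheels to whole cars.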

Sensor Fusion

When looking through clothes whilst shopping, you tend to feel through each of them using your sense of touch to judge each outfit. In an autonomous vehicle, the other “senses” are how we incorporate more data from other sensors—like laser and radar—to get a better understanding of the environment.

Waymo — Google’s Self-Driving Car

Waymo is Google’s self-driving car company (the image on the left). The model includes what appears to be a weird, spinny, camera-looking thing. That “thing” is a LiDAR sensor, commonly paired with other sensors to enhance a car’s ability to perceive the environment. A LiDAR sensor provides the car with 360-degree visibility at all times and determines the distance of an object to the nearest ±2. The sensor allows us to form huge 3D maps that are especially helpful in autonomous vehicles.

The results of a LiDAR sensor detecting objects around the vehicle.

The LiDAR sensor is basically a box mounted on the roof of the car for a good view. It spins constantly as it fires beams of laser light (millions of pulses per second) while cruising through the streets, then measures how long each beam takes to return.

A formula used to measure the distance of an object from a LiDAR sensor.
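The relation behind it is the standard time-of-flight formula: the beam travels out to the object and back, so the distance is the speed of light times the round-trip time, divided by two. A quick sketch:

```python
# Time-of-flight distance, the relation a LiDAR sensor relies on.
# The beam travels to the object and back, so we halve the round trip.

SPEED_OF_LIGHT = 299_792_458  # metres per second

def lidar_distance(round_trip_seconds: float) -> float:
    """Distance (in metres) to the object that reflected the beam."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A beam that returns after about 66.7 nanoseconds hit something
# roughly 10 metres away.
d = lidar_distance(66.7e-9)
```

Doing this millions of times per second, in every direction, is what produces the 3D point clouds pictured above.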


Localization

Imagine looking through the aisles for the right shop that fits your liking: you look at the mall map to learn where you are. This is localization. A Global Positioning System (GPS) is a common way for a vehicle to figure out exactly where it is.

Every autonomous vehicle is localized using 6DoF (six degrees of freedom: roll, pitch, yaw, and the x, y, z axes). This lets the car pin its location down to within centimeters, which helps it avoid accidental lane departures and react when another car cuts it off. The car can then figure out where it is and decide where it wants to go from there.
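As a rough illustration of what a 6DoF pose looks like in code, here is a minimal sketch. The `Pose` class, field names, and the flat-road motion model are simplified assumptions for this example, not any vendor’s API.

```python
import math
from dataclasses import dataclass

# A hypothetical 6DoF pose: three position axes plus roll, pitch, yaw.
@dataclass
class Pose:
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0
    roll: float = 0.0
    pitch: float = 0.0
    yaw: float = 0.0  # heading, in radians

def drive_forward(pose: Pose, metres: float) -> Pose:
    """Dead-reckon a straight move along the current heading (flat road)."""
    return Pose(
        x=pose.x + metres * math.cos(pose.yaw),
        y=pose.y + metres * math.sin(pose.yaw),
        z=pose.z, roll=pose.roll, pitch=pose.pitch, yaw=pose.yaw,
    )

# Facing 90 degrees (due "north"), driving 5 m moves us along y only.
p = drive_forward(Pose(yaw=math.pi / 2), 5.0)
```

In a real car this dead-reckoning estimate would be continuously corrected against GPS, LiDAR maps, and other sensor data.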

Path Planning

Now that you know where you are in the mall and where you want to be, the next step is to find the route that gets you there. This is path planning: you are literally planning your path, much like a GPS charts a course through the world to get us where we’d like to go.

Path planning means predicting a path for the vehicle to follow, built from the data coming in from its many sensors. Path planning also shows up in ML applications, game bots, and even physical robots.
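As a toy illustration of the idea, here is a minimal grid-based planner using breadth-first search. The grid, coordinates, and helper names are invented for this sketch; production planners use richer algorithms (A*, lattice planners) over real map data.

```python
from collections import deque

# A toy grid: 0 = free road, 1 = obstacle. Breadth-first search finds
# a shortest path in steps between two cells, or None if blocked.

def plan_path(grid, start, goal):
    """Return a list of (row, col) cells from start to goal, or None."""
    rows, cols = len(grid), len(grid[0])
    queue = deque([start])
    came_from = {start: None}  # also serves as the visited set
    while queue:
        cell = queue.popleft()
        if cell == goal:
            # Walk the chain of predecessors back to the start.
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in came_from):
                came_from[(nr, nc)] = (r, c)
                queue.append((nr, nc))
    return None  # no route exists

# A wall across the middle forces a detour around the right side.
grid = [
    [0, 0, 0],
    [1, 1, 0],
    [0, 0, 0],
]
route = plan_path(grid, (0, 0), (2, 0))
```

A real vehicle replans like this continuously as the sensor picture changes, which is why the perception and localization steps above feed directly into it.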


Control

To get anywhere we have to walk, turn, and stop when necessary; these are the controls a human uses. A vehicle is similar, but instead of walking, the car accelerates. To control the vehicle and reach your destination, you turn the wheel and work the accelerator.

If a car following a line were to go astray by a bit, it could immediately turn a set amount to return to the line: if too far to the left, turn right, and vice versa. This motion, called Bang-Bang Control, is quite jerky and inefficient. With the full range of angles a steering wheel can take, however, you can use Proportional Control, where the steering wheel steers harder the farther away from the line you are. The angle you need to steer comes from a measurement called the Cross-Track Error (CTE), which is your distance from the desired trajectory (in this case, the line).

A visualization of Proportional Control
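The difference between the two schemes can be sketched in a few lines. The gain and step values here are illustrative, not tuned for any real vehicle.

```python
# Two toy steering laws. Positive CTE means we are off to one side,
# and a negative steering command turns us back toward the line.

def bang_bang(cte: float) -> float:
    """Fixed-size correction regardless of how far off we are: jerky."""
    return -0.5 if cte > 0 else 0.5

def proportional(cte: float, kp: float = 0.3) -> float:
    """Correction scales with the error, so steering smooths out
    as the car approaches the line."""
    return -kp * cte

# Far from the line, proportional control steers hard; close to it,
# only a gentle nudge. Bang-bang yanks the wheel the same amount
# either way.
far = proportional(2.0)     # strong correction
near = proportional(0.05)   # gentle correction
```

In practice this P term is usually combined with derivative and integral terms (a full PID controller) to damp the overshoot that pure proportional control still exhibits.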

Can we trust the system?

Possible Fatalities

So far AI appears to be perfect, finding the best paths and recognizing everything around it without error. But in 2018 in Tempe, Arizona, Elaine Herzberg went for a walk. Whilst she was crossing the road, a self-driving Uber failed to act quickly enough. It is widely cited as the first pedestrian death linked to self-driving technology. The company suspended testing in Arizona along with other cities and areas, including Toronto.

Ethical decisions

After an event like that, it is hard to place such trust in a machine that can’t make ethical decisions in certain situations. Suppose you’re driving when a jaywalking pedestrian appears in front of you, with a car to your left, a school bus right behind you, and an elderly woman walking by on the right. In order to avoid a collision, the driver must decide between multiple hard choices:

  • swerve left but risk hitting the car
  • swerve right but risk hitting the innocent old lady
  • stop and risk the bus hitting the back of your car
  • keep the same pace and risk the life of the jaywalker ahead

It is not as easy to make quick decisions on the road as it is to choose your favorite flavor of ice cream. But with all the latest improvements, ideas, and technology, from 5G to Reinforcement Learning, a machine can potentially be better than an imperfect human. An autonomous vehicle removes the least reliable part of a car, the driver, and thereby saves the world from Human Error.

Action Items:

  • Leave a clap (or more) on this article
  • For more, follow me on Medium (Clap those articles too)
  • Comment and leave feedback/thoughts :)
