A²I : To drive or to be driven?

A pathway to accelerate the pace & steer toward future-ready technology by delivering breakthroughs in innovation!

Varshita Murthy
Arnekt-AI
6 min read · Aug 27, 2018


Image Courtesy : “The vehicles will be fully self-driving. So you have your own personal space where you can sit back and relax.” ~ John Krafcik

In the past few years, autonomous driving has gone from “unimaginable” to “may be possible” to “definitely possible” to “inexorable” to “how did anyone ever think this wasn’t inevitable?”

Before getting into the details of automation and its aesthetics, let’s understand what an autonomous car is. An autonomous car is a vehicle that uses a combination of sensors, cameras, radar and Artificial Intelligence (AI) to travel between destinations without human intervention.

If the roads were mostly occupied by autonomous cars, traffic would flow smoothly and there would be less traffic congestion. In cars that are fully automated, the occupants could do productive activities while commuting to work. People who aren’t able to drive due to physical limitations could find new independence through autonomous vehicles.

Innovation behind the Automation

The history of driverless cars goes back much further than the last few years, however. Leonardo da Vinci designed the first prototype around 1478. Leonardo’s car was designed as a self-propelled robot powered by springs, with programmable steering and the ability to run preset courses.

Another vintage example is the Stanford Cart, built in 1961. By the early 1970s it could navigate around obstacles using cameras, an early application of artificial intelligence.

Image Courtesy : Stanford Cart

Back in 1989, a group of researchers ran a self-driving car project that gave birth to ALVINN (Autonomous Land Vehicle In a Neural Network), a proto-driverless vehicle. ALVINN’s approach was to use a neural network to drive the car, which was absolutely groundbreaking for the time and quickly became an increasingly popular approach in self-driving car efforts. While Google’s self-driving cars rely on 3D maps to situate themselves in their environment, ALVINN’s use of a neural network meant the vehicle could make decisions without the need for a map.
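
To make the idea concrete, below is a minimal sketch of an image-to-steering network in the spirit of ALVINN, written in PyTorch. The layer sizes and names are assumptions for illustration only, not the original ALVINN architecture: a low-resolution camera frame goes in, and a score for each discrete steering direction comes out.

```python
import torch
import torch.nn as nn

# Illustrative ALVINN-style image-to-steering network.
# Layer sizes are assumptions for illustration, not the exact
# architecture of the original ALVINN system.
class TinyDrivingNet(nn.Module):
    def __init__(self, height=30, width=32, hidden=5, steering_bins=30):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),                       # flatten the low-resolution camera "retina"
            nn.Linear(height * width, hidden),  # small hidden layer
            nn.Tanh(),
            nn.Linear(hidden, steering_bins),   # one score per discrete steering direction
        )

    def forward(self, image):
        # image: (batch, 1, height, width) grayscale road frame
        return self.net(image)

model = TinyDrivingNet()
frame = torch.rand(1, 1, 30, 32)                 # one simulated road frame
steering_scores = model(frame)
best_direction = steering_scores.argmax(dim=1)   # index of the preferred steering direction
print(best_direction.item())
```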

The road toward autonomous cars began with incremental automation features for safety and convenience before the year 2000, with cruise control and anti-lock brakes. After the turn of the millennium, advanced safety features including electronic stability control, blind-spot detection, and collision and lane-departure warnings became available in vehicles. Between 2010 and 2016, advanced driver assistance capabilities such as rear-view video cameras, automatic emergency braking and lane-centering assistance emerged, according to the NHTSA. Since 2016, automation has moved toward partial autonomy, with features that help drivers stay in their lane, adaptive cruise control technology, and the ability to self-park.

Image Courtesy : The empty cockpit of an autonomous car

Sensors are already used to map the terrain for vehicles equipped with advanced driver assistance systems (ADAS), but AI will build workable solutions for better safety, more convenience, greater energy efficiency and better precision. However, a key challenge is to define and develop models that find correlations between available physical signals, existing or to-be-developed AI scenarios, deep-learning models, and the real-world impact of decisions in real traffic situations.

As of 2018, car makers have reached Level 3. According to the NHTSA, fully self-driving vehicles are still at least a few years off, because manufacturers must clear a variety of technological milestones and a number of important issues must be addressed before autonomous vehicles can be purchased and used on public roads.

As the industry looks to machine learning as the basis for autonomous systems, artificial intelligence could be the next big path-breaker.

AI-powered self-driving car systems

With the advent of new technology, changing expectations for the motor travel experience and massive shifts in the regulatory and trade environment, the automotive industry is making use of AI, which enables vehicles to identify, manage, make sense of, and respond quickly to real-world data from hundreds of different sensors within a short period.

“Self-driving vehicles, automatically choosing the most efficient route… Artificial Intelligence will dramatically improve logistics.” ~Dave Waters

Developers of self-driving cars use huge amounts of data from image recognition systems, along with neural networks and machine learning, to build systems that can drive independently. The neural networks identify patterns in this data, which includes images from cameras on self-driving cars showing traffic lights, trees, curbs, pedestrians, street signs and other parts of any given driving environment; that data is then fed to the machine learning algorithms.
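
As a rough illustration of that pipeline, the hypothetical sketch below classifies a single camera frame into one of a few driving-scene categories; the class list and the tiny CNN are assumptions made for the example, not the networks used by any production self-driving system.

```python
import torch
import torch.nn as nn

# Toy driving-scene classifier; classes and architecture are illustrative assumptions.
CLASSES = ["traffic light", "tree", "curb", "pedestrian", "street sign", "other"]

class SceneClassifier(nn.Module):
    def __init__(self, num_classes=len(CLASSES)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),     # pool to one value per channel
            nn.Flatten(),
            nn.Linear(32, num_classes),  # one score per scene class
        )

    def forward(self, x):
        # x: (batch, 3, H, W) RGB camera frames
        return self.head(self.features(x))

model = SceneClassifier()
frame = torch.rand(1, 3, 64, 64)                    # one RGB camera frame
probabilities = model(frame).softmax(dim=1)
print(CLASSES[probabilities.argmax(dim=1).item()])  # most likely object class
```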

Video Courtesy : Drive into the forthcoming :)

An archetype of self-driving car development technology is Google’s Waymo. The name Waymo is derived from its mission, “a new way forward in mobility”. Waymo envisions a future with fewer accidents caused by distracted, impaired and generally fallible human drivers, as well as reduced traffic and greater general ease in getting around.

In a recent blog post, Waymo also detailed that its vehicles use machine learning to identify and respond to emergency vehicles or pull off tricky driving maneuvers.

Here is a brief explanation of how Google’s Waymo works (a simplified code sketch follows the list):

  • The passenger sets a destination. The car’s software computes the fastest available route to that destination.
  • A rotating, roof-mounted LIDAR sensor monitors a 60-meter range around the car and creates a dynamic 3D map of the current environment of the car.
  • A sensor on the left rear wheel examines sideways movement to predict the car’s position relative to the 3D map.
  • Radar systems in the front and rear bumpers calculate distances from obstacles.
  • All the sensors in the car are connected to AI software which aids in gathering input from Google Street View and video cameras inside the car.
  • AI software simulates human perceptual and decision-making processes and controls actions in driver-control systems, such as steering and brakes.
  • The car’s software consults Google Maps for advance notice of things like landmarks, traffic signs and lights.
  • An override feature is also available to allow a human to take control of the vehicle at any time.
Making an extraordinary ride feel completely ordinary!
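
Put together, the steps above amount to a repeating loop of sensing, perceiving and deciding. The toy sketch below illustrates one iteration of such a loop; the data structure, thresholds and controller are assumptions made for the example, not Waymo’s actual software.

```python
from dataclasses import dataclass

# Toy sense -> decide step; all names and thresholds are illustrative assumptions.
@dataclass
class SensorFrame:
    lidar_obstacle_distance_m: float  # nearest obstacle seen by the roof LIDAR (~60 m range)
    radar_gap_front_m: float          # distance to the vehicle ahead, from the front radar
    lane_offset_m: float              # lateral offset from the lane centre (wheel sensor + 3D map)

def decide(frame: SensorFrame) -> dict:
    """Turn one fused sensor frame into simple brake and steering commands."""
    brake = frame.lidar_obstacle_distance_m < 10.0 or frame.radar_gap_front_m < 5.0
    steer = -0.1 * frame.lane_offset_m   # steer gently back toward the lane centre
    return {"brake": brake, "steer": steer}

# One iteration of the control loop: read fused sensors, decide, actuate.
frame = SensorFrame(lidar_obstacle_distance_m=42.0, radar_gap_front_m=18.5, lane_offset_m=0.3)
print(decide(frame))  # e.g. {'brake': False, 'steer': -0.03...}
```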

Fueling your mind for the forthcoming!

It’s a captivating time for the motor industry and the ever-shifting landscape of autonomous technology, but any future mass roll-out of autonomous cars remains in the realm of the large-scale manufacturers. One thing that does appear certain is that AI will play a paramount role in the development of Level 4 and Level 5 autonomous cars, helping to reduce accidents and improve mobility. The race is on to produce and demonstrate the most viable, safe and robust systems. Automakers are not only developing complex systems that allow cars to drive themselves, but also furthering existing technologies such as self-parking and pre-safe systems.

R&D at Arnekt

Arnekt helps turn this vision into action. Arnekt envisions the development of AI-powered applications and web services using Cognitive Intelligence. It focuses mainly on building data-driven algorithms that improve logistics for businesses without compromising on performance, helping them make a greater impact as leaders. This gives a competitive edge and creates business value.
