Beyond the Cloud for Self-Driving Cars

You’re cruising down the street in your Tesla on Autopilot. You were skeptical at first that a car could drive safely without your hand on the wheel. Now you love the freedom it gives you.

You close your eyes for a quick snooze. Maybe you flip on a movie.

Suddenly, you hear a loud crash. Up ahead there’s been an accident. Traffic screeches to a halt. Your car must make a quick decision: keep going, stop, or switch lanes?

Distributed Systems

I spoke with self-driving car engineers about this scenario on a recent visit to Stanford’s Volkswagen Automotive Innovation Lab (VAIL).

Real-time decision making is the principal challenge for developers of autonomous cars and drones. Just like human drivers, a self-driving vehicle must collect information about its environment, process it, and make a decision on the safest course of action.
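
To make that loop concrete, here’s a minimal sketch in Python of the sense-process-decide cycle. The sensors, planner, and controls objects and the 50 ms time budget are hypothetical, not any particular vendor’s stack:

```python
import time

CYCLE_BUDGET_S = 0.05  # illustrative 50 ms per decision cycle (20 Hz)

def drive_loop(sensors, planner, controls):
    """Sense -> process -> decide -> act, repeated on a fixed time budget."""
    while True:
        start = time.monotonic()

        observation = sensors.read()           # collect data about the environment
        scene = planner.update(observation)    # process it into a model of the scene
        action = planner.decide(scene)         # choose the safest course of action
        controls.apply(action)                 # steer, brake, or accelerate

        # A cycle that overruns its budget means the car is acting on stale information.
        elapsed = time.monotonic() - start
        if elapsed < CYCLE_BUDGET_S:
            time.sleep(CYCLE_BUDGET_S - elapsed)
```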

A key question for autonomous vehicles involves “distributed systems.” Where will driving decisions be made — in the car itself, or in some remote control center? How much intelligence must be stored locally, and how much will reside in the cloud?

From the Cloud to the Edge

Today we live in a cloud-centric world of consumer technology. When you ask Siri a question, it’s routed to the cloud, where Apple’s servers process the question and send back an answer. Nearly everything you do on your iPhone is a snapshot of work being done in the cloud.

This paradigm may be changing soon. Peter Levine of Andreessen Horowitz has talked about “the end of cloud computing.” The cloud won’t go away anytime soon. But emerging technologies that enable self-driving cars will require a shift in how we think about cloud computing. We’ll see the same shift in other new platforms like drones, home automation, and AR/VR.

Advances in sensors, storage, and machine learning / AI are driving a shift in computing from the cloud to the edge.

To understand what’s enabling these new platforms and the implications for the cloud, we need to understand 3 major trends:

  1. Real-world data
  2. Real-time decisions
  3. Machine learning

Real-world data

Self-driving cars will be less like traditional vehicles and more like giant computers that collect and process enormous amounts of data. As Levine noted, an autonomous vehicle may have over 200 connected computers inside of it: “A self-driving car is a data center on wheels.”

These moving data centers will rely on a multitude of sensors that collect information from the environment. In particular, next-generation computer vision will enable self-driving cars to collect real-world data as well as or better than humans.

I recently saw a demo of Intel’s latest 3D sensing and compute hardware. Their computer vision chips, less than 4 mm thick, capture 600 frames per second along with a 3D measure of distance. They generate a 360-degree depth map that can help a car navigate through traffic and avoid collisions.
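
As a toy illustration (not Intel’s API), here’s how such a depth map, modeled as a list of distances per degree of heading, could be checked against the car’s stopping distance:

```python
# Toy illustration: check a 360-degree depth map against the car's stopping distance.
# depth_map[i] is the distance (meters) to the nearest object at heading i degrees.

def stopping_distance_m(speed_mps, reaction_s=0.2, decel_mps2=6.0):
    """Distance covered while reacting, plus distance needed to brake to a stop."""
    return speed_mps * reaction_s + speed_mps ** 2 / (2 * decel_mps2)

def headings_at_risk(depth_map, speed_mps, forward_arc=range(-30, 31)):
    """Headings (degrees, 0 = straight ahead) with an obstacle inside stopping distance."""
    limit = stopping_distance_m(speed_mps)
    return [h for h in forward_arc if depth_map[h % 360] < limit]

depth_map = [200.0] * 360
depth_map[0] = 40.0  # stalled car 40 m dead ahead
print(headings_at_risk(depth_map, speed_mps=25.0))  # 25 m/s ~ 56 mph -> [0]
```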

We’re on the cusp of an explosion in real-world data collected by computer vision and other sensors. This will require new approaches to cloud computing and network infrastructure.

Real-time decisions

Self-driving car designers talk about two communication technologies that will be used to enhance safety: V2V (vehicle to vehicle) and V2I (vehicle to infrastructure). Both will require links capable of exchanging information quickly, securely, and reliably.
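
The sketch below is illustrative only, loosely modeled on the kind of basic safety broadcast V2V systems exchange; it is not a real wire format or standard message definition:

```python
from dataclasses import dataclass

@dataclass
class V2VSafetyMessage:
    """Illustrative V2V broadcast; real deployments use standardized message sets."""
    vehicle_id: str      # pseudonymous identifier
    timestamp_ms: int    # when the reading was taken
    latitude: float
    longitude: float
    speed_mps: float
    heading_deg: float
    brake_applied: bool  # lets following cars react before the hazard is even visible

# A hard-braking car ahead can warn followers before their own sensors see the problem.
msg = V2VSafetyMessage("veh-1f3a", 1_700_000_000_000, 37.4275, -122.1697,
                       27.0, 92.5, brake_applied=True)
```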

Existing 4G LTE networks are fine for in-car infotainment and traffic information. The amount of data generated by self-driving cars will be orders of magnitude bigger: a self-driving car’s computer vision system alone may generate over 10 GB of data per mile. LTE networks aren’t fast enough to ship that volume of data to the cloud and process it in real time.
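
A back-of-the-envelope calculation makes the gap concrete. It uses the 10 GB per mile figure above and assumes highway speed and a generous real-world LTE uplink:

```python
# Back-of-the-envelope: can an LTE uplink keep up with 10 GB of sensor data per mile?
GB_PER_MILE = 10
SPEED_MPH = 60            # assumed highway speed
LTE_UPLINK_MBPS = 50      # generous assumption for a real-world LTE uplink

gb_per_second = GB_PER_MILE * SPEED_MPH / 3600   # ~0.17 GB of raw sensor data per second
required_mbps = gb_per_second * 8 * 1000         # ~1,300 Mbit/s

print(f"required: {required_mbps:,.0f} Mbit/s, available: ~{LTE_UPLINK_MBPS} Mbit/s")
# The car would need to upload more than 25x what the link can carry, before even
# counting the latency of a round trip to the cloud.
```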

The fifth generation of mobile networks, 5G, may be capable of handling 5–10 gigabits per second. But the rollout of these networks is years away and will require huge capex by network operators. And even those higher transmission rates will likely be outpaced by the explosion in data collected by new industrial applications like connected cars.

As sensors proliferate and the amount of data increases, cars will need to compute data locally and make decisions in real time.

Machine Learning

The real-world data collected by self-driving cars — images, sounds — will be vast and highly unstructured. The systems that process this data will need to be continually trained and refined.

This is where machine learning comes in. Autonomous cars will need machine learning algorithms to extract meaning from the huge amounts of unstructured data. These programs will enable a self-driving car to navigate unexpected roadblocks, traffic or weather.
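
As a stand-in example (a real perception stack would use deep neural networks, not a three-centroid classifier), the sketch below shows the basic move: turning an unstructured sensor frame into a structured label that a planner can act on.

```python
import math

# Hypothetical "learned" centroids in a tiny 3-number feature space
# (say: brightness, edge density, motion); continual retraining would refine these.
CENTROIDS = {
    "clear_road":      (0.8, 0.1, 0.0),
    "stopped_traffic": (0.4, 0.6, 0.0),
    "debris_on_road":  (0.5, 0.7, 0.2),
}

def classify(frame_features):
    """Return the label whose centroid is closest to the frame's features."""
    return min(CENTROIDS, key=lambda label: math.dist(frame_features, CENTROIDS[label]))

print(classify((0.45, 0.65, 0.05)))  # -> "stopped_traffic"
```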

Image credit: a16z and Peter Levine

Get offa’ my cloud

What do these trends mean for the future of cloud computing?

In Levine’s vision, drones and self-driving cars will run machine learning algorithms locally, in real time — not in the cloud. Levine sees the cloud as a place for long-term learning and storage: it will store curated data from the edge, learn from that data, and then push the learnings back to cars and other devices on the network edge.
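
Here’s a rough sketch of that division of labor, with hypothetical car and cloud interfaces: inference and decisions stay on the vehicle, while curated data flows up and improved models flow back down.

```python
def edge_cycle(car):
    """Runs on the vehicle, every few milliseconds: decide locally, in real time."""
    frame = car.sensors.read()
    action = car.model.decide(frame)
    car.controls.apply(action)

    if car.model.is_uncertain(frame):     # curate: only interesting frames are worth uploading
        car.upload_queue.append(frame)

def periodic_sync(car, cloud):
    """Runs occasionally (say, overnight on Wi-Fi): long-term learning lives in the cloud."""
    cloud.ingest(car.upload_queue)        # storage and training data for the whole fleet
    car.upload_queue.clear()

    latest = cloud.latest_model_version()
    if latest > car.model.version:        # learnings flow back down to the edge
        car.model = cloud.download_model(latest)
```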

Processing power at the edge will keep increasing, following the same trajectory as sensors and machine learning tools. Just as the iPhone seemed revolutionary 10 years ago, the next generation of edge devices will seem mind-blowingly powerful.

And when your Tesla’s Autopilot has to make a life-or-death decision, you’ll be able to trust that your car’s brain is smart enough to keep you safe.
