If We Want To Get To Real Time AI, We’ve Got To Build Another iPhone Industry. Five Times Over.

azeem
10 min read · Jul 17, 2017

Produced in partnership with NewCo Shift.

In October 2016, Tesla announced a significant change to its Advanced Driver Assistance System package. This is the combination of sensors and computing power that will enable Tesla to fulfill Elon Musk’s promise to drive “all the way from a parking lot in California to a parking lot in New York with no controls touched in the entire journey” by the end of 2017.

Amongst the many changes to the sensor package was a switch in the system’s brain. Previously powered by a processor from Mobileye (recently acquired by Intel), the package now sports an Nvidia Drive PX 2. Why?

It turns out that to be safe, a self-driving car needs an extraordinary amount of data from its sensor systems. And to figure out what all those sensors are telling it, the car requires an unprecedented amount of processing. Once it knows what is going on in its environment, yet more processing is needed to help the car figure out what to do next.
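To put a rough number on “extraordinary”, here is a minimal back-of-envelope sketch in Python. The camera count, resolution, frame rate and pixel depth below are all assumptions chosen for illustration, not Tesla’s actual sensor specs; the point is that even conservative figures yield nearly a gigabyte of raw pixels every second, all of which the perception system must digest in real time.

```python
# Back-of-envelope estimate of the raw camera throughput an ADAS rig
# must handle. Every figure here is an illustrative assumption for the
# sake of arithmetic, not Tesla's published specification.

CAMERAS = 8                  # assumed number of cameras on the car
WIDTH, HEIGHT = 1280, 960    # assumed per-camera resolution in pixels
FPS = 30                     # assumed frames per second
BYTES_PER_PIXEL = 3          # assumed uncompressed 24-bit colour

bytes_per_second = CAMERAS * WIDTH * HEIGHT * FPS * BYTES_PER_PIXEL
print(f"Raw camera feed: {bytes_per_second / 1e9:.2f} GB/s")
# -> Raw camera feed: 0.88 GB/s, before radar and ultrasonics are added,
#    and before a single perception or planning pass has run.
```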

The switch that Tesla made gives a clue as to just how much processing. The Mobileye EyeQ3 was a significant chip: 42 mm² in area (about a quarter of the size of a modern Intel i7 processor), packing transistors using a manufacturing process…

