»Control the perception stack«

Thoughts on Apple Car, Part 130

Reuters reports:

What is clear from Apple’s interest in cheaper lidar systems is that it wants to control the “perception stack” of sensors, computers and software to drive an autonomous vehicle, regardless of who makes the vehicle.

Makes sense, and follows the ‘control primary tech’ mantra of the Cook era.

What makes the report special, however, is the following:

“They’re not happy with most of what they see […] looking for a revolutionary design.”

Apple is of course looking to do things differently and better than what others have done so far. The highest priority goes to design:

A third person familiar with the matter said Apple is seeking a “design-oriented” sensor that would be sleek and unobtrusive enough to fit into the overall lines of a vehicle.

That fits well with what I expect Apple Car to be: a completely new form factor for a car that comes as a package, with no arbitrary third-party elements whose look or behavior Apple cannot control.

This also goes for the sensors.

And it seems that former Apple people are working on just that:

What they are doing »doesn’t operate like any typical system. While other lidar systems send out pulses of light to measure distance, Aeva’s uses a continuous laser light to gather data more quickly.«
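The difference the quote hints at can be made concrete. Pulsed (time-of-flight) lidar times a light pulse's round trip; a continuous-wave approach like the frequency-modulated (FMCW) technique Aeva has publicly described instead reads distance from the frequency shift between the outgoing chirp and its returning echo. A toy sketch of the two ranging formulas, with illustrative numbers that are not Aeva's actual specs:

```python
# Toy comparison of pulsed vs continuous-wave lidar ranging.
# All figures below are illustrative, not any vendor's real parameters.

C = 299_792_458.0  # speed of light, m/s

def tof_distance(round_trip_s: float) -> float:
    """Pulsed lidar: distance from the pulse's round-trip time."""
    return C * round_trip_s / 2

def fmcw_distance(beat_hz: float, chirp_s: float, bandwidth_hz: float) -> float:
    """FMCW lidar: distance from the beat frequency between the
    outgoing frequency chirp and its delayed echo."""
    return C * beat_hz * chirp_s / (2 * bandwidth_hz)

# A target 150 m away, measured both ways:
rt = 2 * 150 / C                       # round-trip time for the pulsed case
print(round(tof_distance(rt), 1))      # 150.0

bw, t_chirp = 1e9, 10e-6               # 1 GHz chirp swept over 10 µs
beat = 2 * 150 * bw / (C * t_chirp)    # beat frequency this echo produces
print(round(fmcw_distance(beat, t_chirp, bw), 1))  # 150.0
```

Because the FMCW echo also carries a Doppler shift, the same measurement yields the target's relative velocity, which is part of why a continuous-wave system can gather usable data faster.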

And more quickly is just what Apple needs: They are behind when it comes to testing autonomous systems in real-life scenarios. Tesla is leading in that regard.

But Tesla doesn’t believe in lidar.

During their already infamous Autonomy Day, Elon Musk said they use lidar only at SpaceX, where they built those sensors from scratch.


Seems to be exactly what Apple is doing at the moment.