New Age of Auto Tech: Intro and In-Cabin Monitoring

David Rubino
Aug 12, 2020


https://turo.com/blog/gearheads/5-high-tech-features-coming-to-more-cars
Boy, that driver sure is going fast

Up until very recently, a great deal of ink was spilled about the seemingly “just around the corner” world of autonomous driving. Ever since DARPA announced its Grand Challenge back in the halcyon days of 2002, there has been talk that we’re but a few years away.

Never mind that in the 2004 challenge that grew out of that announcement, none of the vehicles actually completed the course; there was a sense even then that the projected mandate of autonomous vehicles [for military purposes] by 2015 was simply too slow a timeline. By the time that deadline came and went, CEOs were proclaiming that we were only two years away from self-driving cars, 2016 brought yet more momentum for the space with autonomous vehicles “winning” CES, and 2017 was poised to be even bigger.

Things started to cool off considerably by 2018, following several high-profile crashes, increasingly bald acknowledgement of the computing challenges the field faces, and less-than-stellar public perception. As it stands now in 2020, the excitement has tapered off from where we once were. Gartner most recently placed self-driving cars at the bottom of its hype cycle and pegged them as at least 10 years out.

Gartner: Hype Cycle For AI, 2019

While there is certainly interest in the space, as evinced by enormous rounds like Aurora’s $600M Series B, and the bevy of OEMs who lay claim to some degree of autonomy, the autonomous driving world isn’t quite at those 2002 levels of hype — and of course this was made no easier by the pandemic forcing test vehicles off the roads.

So where does automotive tech go from here? I’ve identified several areas where I believe an investor should be looking in this new age of automotive tech. For this first post, I’ll be getting into:

Interior Cabin Sensing

One of the key promises of self-driving cars is that drivers will simply be able to, effectively, pass out behind the wheel: just plug in the destination and let it rip! Sounds great, but the reality, of course, isn’t quite so binary. There are generally acknowledged to be six levels of autonomy:

Synopsys: The 6 Levels of Vehicle Autonomy Explained

If your car can maintain adaptive cruise control, you might have what could be considered level 1 automation. Cars that came out a few years later might add some gentle speed and steering autonomy for level 2, and so on. Level 3 brings the need for the occasional human override, level 4 makes those override scenarios increasingly rare, and level 5, well, good night, wake me when we get there.

Yet because the automotive industry isn’t about to (or rather, can’t and shouldn’t) launch right into level 5 autonomy, you’ll notice one consistent factor: the need for human intervention at various points. With level 2 and 3 systems broadly on the road now, it will be years of handoffs between humans and car systems before level 4 arrives en masse, perhaps decades. Even then there will still be a need for handoffs, just less often, and, to state the obvious, commuters will not all uniformly get the newest level of tech right away.

But how will cars know when to facilitate these handoffs? They have a whole suite of sensors looking out, but they should spend more time looking in. Specifically, sensors that can assess driver attention, wakefulness, gaze direction, and distraction levels will all prove increasingly necessary as manufacturers try to answer the question of when these handoffs should take place.
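
To make that concrete, here is a minimal sketch of how a vehicle might gate a handoff on in-cabin signals. The signal names, thresholds, and structure are illustrative assumptions of mine, not any manufacturer’s actual logic:

```python
from dataclasses import dataclass

@dataclass
class CabinState:
    eyes_on_road: bool             # from gaze tracking
    hands_on_wheel: bool           # from torque or capacitive sensing
    drowsiness: float              # 0.0 (alert) to 1.0 (asleep), from blink/eyelid metrics
    seconds_since_response: float  # time since the driver last acknowledged a prompt

# Illustrative thresholds; a real system would tune these against regulation and validation.
MAX_DROWSINESS_FOR_HANDOFF = 0.3
MAX_RESPONSE_LAG_S = 8.0

def driver_ready_for_handoff(state: CabinState) -> bool:
    """Return True only if in-cabin sensing suggests the driver can retake control."""
    return (
        state.eyes_on_road
        and state.hands_on_wheel
        and state.drowsiness <= MAX_DROWSINESS_FOR_HANDOFF
        and state.seconds_since_response <= MAX_RESPONSE_LAG_S
    )
```

When a check like this fails, systems generally escalate: chimes, seat vibration, and ultimately a minimal-risk maneuver such as slowing and pulling over.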

Even outside of autonomy, there are a lot of applications here: laws like Italy’s require an alert when a child has been left in the car, and airbags can be deployed more effectively based on the positions of cabin occupants. There’s a lot to be done, and you don’t have to take my word for it; let’s look at how often “driver monitoring” has come up on calls over the last decade:

https://www.cbinsights.com/research/report/in-vehicle-experience-technology-future/
CBInsights: In-Vehicle Experience

A lot of applications, a lot of attention, and a lot of ways companies are looking to tackle the problem.

Eyeris, for instance, offers surface classification, human behavioral understanding, and object localization for their solution, alongside claims of the world’s largest in-vehicle dataset.

Affectiva relies on their understanding of complex emotional and cognitive states to make judgement calls for the cabin, and offers ways to remedy situations like drowsiness by, for instance, finding the nearest coffee shop.

Vayyar believes that in-cabin MIMO (multiple-input, multiple-output) radar offers a way to simultaneously monitor the whole cabin at the voxel level for less than the cost of a high-performance stereo camera.

Eyesight uses IR sensing alongside a software suite to assess pupil dilation, blink rate, and other metrics from the target driver, on top of touch-free gesture control for infotainment.
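
As a rough illustration of how blink-style metrics get used (a sketch of the general technique, not Eyesight’s actual pipeline), many drowsiness estimators compute something like PERCLOS, the fraction of time over a rolling window that the eyes are mostly closed:

```python
from collections import deque

class PerclosEstimator:
    """Rolling PERCLOS estimate: the share of recent frames in which the eye
    is mostly closed. All values here are illustrative, not vendor-specific."""

    def __init__(self, window_frames: int = 1800, closed_threshold: float = 0.2):
        self.frames = deque(maxlen=window_frames)  # roughly 60 s of frames at 30 fps
        self.closed_threshold = closed_threshold   # openness below this counts as "closed"

    def update(self, eye_openness: float) -> float:
        """eye_openness: 0.0 (fully closed) to 1.0 (fully open), one reading per frame."""
        self.frames.append(eye_openness < self.closed_threshold)
        return sum(self.frames) / len(self.frames)
```

Published thresholds vary, but sustained PERCLOS values in the low tenths over a minute-scale window are commonly treated as a drowsiness signal.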

Of course, there are major players in the space as well, from TI to Lumentum to Rivian, but I believe there’s room for startups to thrive with particularly lightweight, adaptive solutions.

Which in particular? Well, keep your eyes on this page, as I’ll be doing a more in-depth dive into the in-cabin monitoring space, all of the attendant companies, and the technologies (ToF, VCSEL, radar, IR, LED, mmWave, etc.) in a later post. For now, I’d keep this area in mind as you think about which domains of automotive tech you should be looking at.

Well, that’s about it for this one. Watch for the next, where I’ll be giving some thoughts on wireless power transfer and its potential impact on wiring harnesses in cars. As ever, I can be found at david@gvc.partners should anyone want to discuss further.
