2017 was an eventful year for robotic vehicles, so it’s a topic I figured I should follow up with a longer, dedicated set of notes. After helping bootstrap one still stealth-mode company in a related area during 2016, I took half a step away from the immediate space into the related field of Mobility as a Service, including both trunk route services and on-demand ride-hailing and -sharing services, obviously with human drivers involved. With Kyyti, we’re operating in customer mode already, which works better for me than the still mostly lab-mode work on the vehicle technology itself. Still, robotic vehicles are immensely interesting and I’ve been keeping an eye on the field. Here goes!

Starting with the big boys: Waymo, the former Google X project, has announced they will begin to serve customers in Phoenix, Arizona during 2018. This team has been working on autonomous driving technology for perhaps the longest time and has made decent technological progress. Still, for quite a while externally visible milestones were fairly rare, and the business model, or what we on the outside know of it, has been up in the air. Early on I thought it would be an extension of Google’s, that is, the vehicle as a tool which essentially replicates the cost-per-action advertising model of Google’s online services, with a vehicle that takes customers to commercial locations like malls and restaurants. That’s obviously a somewhat limited scope in relation to all possible travel needs, but then again, Google also offers services like Gmail to consumers without immediate revenue streams. More recently, Waymo has made noises pointing at possibly becoming a technology stack for vehicle OEMs instead, i.e. in Android style. Still, it’s somewhat difficult to see car makers getting excited about handing over their business to Google/Alphabet.

Which takes us to the OEMs. GM, with its subsidiary Cruise, has similarly promised to begin taxi-like operations with automated Chevy Bolt vehicles by 2019. They must be fairly confident in their technology to make that commitment, though it’s not clear to me whether they plan to have drivers as in-vehicle back-ups for this service. If they do, that removes the economic benefit of the technology almost entirely, but they may see it as a necessary stepping stone.

Tesla was the other early leader in at least semi-autonomy, and having made the early and at the time ingenious leap of deploying sensors in delivered customer vehicles, still has an immense lead in the amount of recorded road distance available as training data for their technology. On the other hand, they’ve changed processing hardware a couple of times, and while Elon Musk continues to promise that the hardware currently installed in new Model 3 vehicles will be software-upgradable to full autonomy, experts remain dubious.

Which introduces us to a sidetrack: Tesla’s approach continues to be based on cameras and radars, while almost everyone else depends on lidars. Cameras you probably already know about. Except that the vehicular cameras usually include infrared channels, because it turns out that there are temperature differences in road traffic. Who would have thought? Radars you probably also have at least a vague idea about; they are relatively cheap, low-resolution tools for measuring distances to objects which reflect radio waves. Lidars, on the other hand, are almost the same as radar, with the exception that they’re based on (infrared) lasers instead of radio waves, which means, due to the tight laser beam, that there must also be hardware to steer the beam across the scene. This has been mechanical (rotating mirrors), thus large, expensive and failure-prone. This year several solid-state lidar devices entered the prototype phase, and samples are shipping, or will ship in 2018, to users. Lidars will become far cheaper very soon. Whether that makes them better than cameras and visual analysis is still up in the air, though. Personally, I’d still bet on cameras, because lidars suffer from both limited range and weak performance in direct sunlight. Whether lidar or camera, the processing is very compute-intensive, and it’s unlikely any current-generation hardware is up to the requirements. Nvidia has been the leading supplier of compute platforms for these vehicles, though the very latest news is that Tesla is in fact about to start designing their own chips for the purpose.
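As an aside on how similar radar and lidar really are under the hood: both estimate range by timing a reflected pulse, and the engineering differences are in wavelength, beam steering and angular resolution rather than in the basic arithmetic. A back-of-the-envelope sketch in Python, with numbers that are illustrative only and not from any particular sensor:

```python
# Minimal sketch: both radar and lidar derive range from the round-trip time
# of an emitted pulse. The example numbers below are illustrative only.

C = 299_792_458.0  # speed of light, m/s


def range_from_round_trip(round_trip_s: float) -> float:
    """Distance to the reflecting object, given the pulse's round-trip time."""
    return C * round_trip_s / 2.0  # divide by two: the pulse travels out and back


if __name__ == "__main__":
    # A reflection arriving ~200 nanoseconds after emission puts the object at ~30 m.
    print(f"{range_from_round_trip(200e-9):.1f} m")  # -> 30.0 m
```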

But back to the road and vehicles. Toyota dragged their feet for a long time, but then they hired Gill Pratt and set up a very well funded research unit.

The rest of the car companies, at least Ford and Nissan, seem to be working with partner companies to find themselves the software chops to do autonomy. Fiat-Chrysler and BMW have hooked up with Intel. Honda is a bit of a mystery to me; they seem to be working on in-house tech, as well as with Waymo, but are years behind the others. Volvo’s Chinese owners may have some technology of their own, but they’re also working with Uber. Volkswagen, despite promising autonomous vehicles by 2021, and Daimler are the least transparent of the bunch; my guess is they will ultimately end up buying their solutions from Intel, Bosch, or one of the startups.

Before we get to the startups, there’s that Uber. This financial analysis from a year ago laid bare why Uber will need to get autonomous vehicles rolling and displace their drivers in order to capture enough value to turn a profit. That explains why they’ve been pushing their own robotic vehicle development so hard (possibly too hard, but that’s for the courts to decide). The trial has been delayed but will start in early 2018 and will certainly be one to follow, not least because the judge in the case, the honorable William Alsup, is probably the most technologically clued-in person in the whole US federal judiciary. Edit 30th Dec: It escaped my notice in the summer that Uber formed a joint venture with Yandex (Russia), but this was approved in November and is definitely worth a note, as both companies have had autonomy under development and are sure to now share resources.

And then the startups. These are still too numerous to list: Nutonomy, Drive.ai, Comma.ai, AImotive, Sensible4, research teams all over the universities. It’s hard to see any of these remaining independent, and I’m sure many are envious of the early marriage of Cruise with GM. As far as I can tell, all of them are playing the same game: hook cameras to a drive-by-wire vehicle like a used Prius, put an Nvidia DRIVE PX computer in the trunk and try to get a deep learning algorithm to recognize the surroundings and steer the car. Getting this to a proof of concept isn’t difficult any more, but getting all the way to a reliable system is very, very hard. I would still bet on a software-focused startup rather than a manufacturer with a legacy in dinosaur-burning machinery to get there first.
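To make that recipe a bit more concrete, here is a deliberately minimal sketch of the end-to-end “camera frame in, steering angle out” idea these teams chase, written in PyTorch. The layer sizes and names are illustrative assumptions of mine, not any particular company’s network, and a real system wraps far more around this (datasets, recovery cases, redundancy).

```python
# Deliberately minimal sketch of an end-to-end steering network: a camera
# frame goes in, a single steering-angle prediction comes out. The layer
# sizes are illustrative, not any shipping system's architecture.
import torch
import torch.nn as nn


class SteeringNet(nn.Module):
    """Maps a 3x66x200 camera frame to one steering-angle value."""

    def __init__(self) -> None:
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 24, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(24, 36, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(36, 48, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(48, 64, kernel_size=3), nn.ReLU(),
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 3 * 20, 100), nn.ReLU(),
            nn.Linear(100, 1),  # predicted steering angle
        )

    def forward(self, frame: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(frame))


if __name__ == "__main__":
    model = SteeringNet()
    frames = torch.randn(8, 3, 66, 200)  # a batch of 8 fake camera frames
    print(model(frames).shape)           # -> torch.Size([8, 1])
```

Training something like this against recorded human steering is a plain regression exercise; covering the long tail of rare situations reliably is where the “very, very hard” lives.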

Cars are where the visible development action is, but they’re not where autonomous driving will make the greatest impact. That’s going to be in professional fleets of larger vehicles. Delivery trucks (with both DHL and UPS working on their own), long-haul trucks (where platooning, i.e. several semi-autonomous trucks following one human-driven lead vehicle in close formation, has already been interoperability-tested on at least EU highways), buses and shuttles are where significant operational savings are to be had, either by reducing the number of drivers or by keeping vehicles moving even during legally mandated driver break times. The professional fleets will also be better able to deliver and benefit from human oversight provided from remote locations, and are thus much more amenable to Level 4 autonomy, where the vehicles maintain operations under most conditions, safely pull over when they can’t, and call for human attention to get themselves out of trouble.
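To spell out what that Level 4 fallback behaviour amounts to in logic, a toy state machine might look like the sketch below; the states and triggers are my own simplification for illustration, not taken from any standards document or vendor.

```python
# Toy sketch of the Level 4 fallback logic described above: drive while
# conditions allow, otherwise pull over safely and request remote human
# attention. States and triggers are illustrative only.
from enum import Enum, auto


class Mode(Enum):
    DRIVING = auto()
    PULLING_OVER = auto()
    WAITING_FOR_REMOTE_OPERATOR = auto()


def next_mode(mode: Mode, conditions_ok: bool, stopped_safely: bool) -> Mode:
    if mode is Mode.DRIVING:
        return Mode.DRIVING if conditions_ok else Mode.PULLING_OVER
    if mode is Mode.PULLING_OVER:
        return Mode.WAITING_FOR_REMOTE_OPERATOR if stopped_safely else Mode.PULLING_OVER
    # Waiting: the vehicle returns to normal driving only once conditions
    # are reported as acceptable again (here simplified to a single flag).
    return Mode.DRIVING if conditions_ok else Mode.WAITING_FOR_REMOTE_OPERATOR


if __name__ == "__main__":
    mode = Mode.DRIVING
    for ok, stopped in [(True, False), (False, False), (False, True), (True, True)]:
        mode = next_mode(mode, ok, stopped)
        print(mode.name)
    # prints DRIVING, PULLING_OVER, WAITING_FOR_REMOTE_OPERATOR, DRIVING
```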

The above is also an economic justification for why the still missing remote control standards for autonomous vehicles may become a regulatory necessity before large-scale adoption. Despite the obvious safety questions around remote takeover, vehicles will need the capability to be integrated into cross-OEM fleets, to be managed and orchestrated for congestion management and diversion routing, and to yield to rescue and police traffic. All of these will require receiving instructions remotely over a secure channel, and direct remote driving (even if only at very restricted speeds) is a simple extension from there. At least Nissan has laid out their vision as one which integrates remote control from the beginning, and I believe they are right in this.
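No such standard exists yet, so purely to illustrate what a minimal cross-OEM instruction message could carry, here is a sketch covering the diversion-routing and yield-for-emergency cases from above. The field names are invented for this example, and transport security is assumed to come from an ordinary mutually authenticated channel such as TLS rather than from the message itself.

```python
# Purely illustrative sketch of a remote fleet instruction; the fields are
# invented for this example, not from any existing or proposed standard.
# Authentication and encryption are assumed to be handled by the channel.
import json
from dataclasses import asdict, dataclass
from enum import Enum


class InstructionType(Enum):
    DIVERT_ROUTE = "divert_route"        # congestion management / diversion routing
    YIELD_EMERGENCY = "yield_emergency"  # make way for rescue or police traffic
    PULL_OVER = "pull_over"              # stop safely and await further orders


@dataclass
class RemoteInstruction:
    vehicle_id: str
    issued_by: str          # fleet or traffic-management authority identifier
    instruction: InstructionType
    valid_until: str        # ISO 8601 timestamp after which the order expires
    payload: dict           # e.g. a diversion waypoint list

    def to_json(self) -> str:
        data = asdict(self)
        data["instruction"] = self.instruction.value
        return json.dumps(data)


if __name__ == "__main__":
    order = RemoteInstruction(
        vehicle_id="shuttle-042",
        issued_by="city-traffic-centre",
        instruction=InstructionType.YIELD_EMERGENCY,
        valid_until="2018-01-15T08:30:00Z",
        payload={"reason": "ambulance approaching from behind"},
    )
    print(order.to_json())
```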

It may be that the benefits for private cars will require Level 5 autonomy, i.e. fully automatic operation under all conditions, a challenge that, especially on mixed-use city streets, is several orders of magnitude more difficult to meet. By the time the technology is able to get there, the trend may well have already moved to mobility services instead of private car ownership. Cities cannot handle more cars, private or ride-hailing, which leads to the need for effective use of shared capacity. Whether with human or automated driving, one-passenger vehicles will need to be replaced with solutions that use the streets better and move more people.