On July 30, a fleet of self-driving Nissan NV200 minivans owned by Drive.ai, a small company founded in 2015 by students of Stanford University’s Artificial Intelligence Lab, began carrying passengers in Frisco, a city of 175,000 inhabitants about 40 km from Dallas, known for its growing ties to the technology sector and home to Uber Elevate, Uber’s flying taxi division.
The Drive.ai pilot scheme will last six months and aims to serve 10,000 people within Frisco city limits, an area that includes a number of highly congested routes passing by shopping centers, office buildings and sports venues, in a bid to establish the positive impact of driverless vehicles on reducing traffic jams.
What sets the initiative apart, aside from the fact that it is not being carried out by self-driving giants like Waymo or Cruise, is Drive.ai’s approach to communicating with pedestrians and other cars, an often overlooked aspect. Autonomous vehicles typically see pedestrians as entities with difficult-to-anticipate behavior. From the pedestrian’s perspective, a car is a box with a decision-making person inside, with whom it is possible to communicate, usually by a glance: a driver can indicate with a gesture for us to cross. But when the vehicle is autonomous, the relationship with pedestrians is managed entirely by a set of algorithms that try to anticipate their decisions. There can be no communication as such.
The public’s attitude toward autonomous vehicles is important, all the more so after the fatal accident last March in Tempe (Arizona) involving an Uber vehicle. In response, Drive.ai’s vehicles are painted a bright orange, immediately discernible in city traffic and bringing to mind the safety associations of a school bus.
To improve communication with pedestrians and other drivers, Drive.ai has designed a system using four external screens: one on each side of the vehicle, one in front and one behind. These show messages, or combinations of messages, informing other road users as to whether the vehicle is waiting for a pedestrian to cross, about to move forward or backward, picking up or dropping off passengers, or operating in manual mode. For example, when the vehicle stops at a zebra crossing, it will display “Waiting for you to cross” on the front and side screens, and “Crossing” at the back, along with an image of a pedestrian crossing.
Drive.ai’s idea is clearly to make its vehicles seem more “friendly” by making their intentions clear at all times. This is a car that provides feedback beyond brake lights or indicators, unlike those being tested by Waymo, Uber and others, though these are features that could be incorporated if they are seen to work.
As mentioned a few weeks ago, there is more going on in the field of autonomous driving than just what the big companies are up to, and some of these other initiatives offer interesting visions and approaches. It is still too early to know how autonomous transportation will work: whether users will simply indicate a point of origin and a destination, whether we’ll see alternatives such as supplementing public transport by taking passengers to bus stops, or whether there’ll be some intermediate solution or a mix of approaches depending on demand.
Either way, self-driving technology is increasingly mature and has moved beyond the question of simply transporting people safely to focus on communication with other road users and on which transportation models will work best. Recent developments show that self-driving vehicles will be on our roads much sooner than was thought even a couple of years ago, while regional and local governments are taking an increasingly proactive attitude, encouraging tests on the roads of our cities.
(In Spanish, here)