Self-driving cars: will they really happen?

Wilbert Tabone
7 min read · Mar 6, 2023


Back in 2020, in the midst of the pandemic, I had the pleasure of sitting down (albeit virtually) with 16 renowned experts in the fields of Human Factors and Design.

The topic of discussion? Life as a road user in the future urban environment. But not any road user — we are talking about vulnerable road users: pedestrians, cyclists, scooter riders, and the like. It seems the roads are not as safe as we think. In fact, the World Health Organisation points out that there are over 1.3 million traffic-related deaths each year, with more than half being vulnerable road users! The utopian dream is to eliminate most of these fatalities by automating traffic.

Photo by Alessio Lin on Unsplash

I was curious to know when self-driving cars are expected to hit urban roads, when we can expect to see some form of smart infrastructure around us, and how we are to interact with these robot vehicles.

When do we expect self-driving cars to hit the urban environment?

Well, in a way they are already driving around our cities, but most still have safety stewards operating in the car, or remotely. The self-driving cars we see in movies — those which can drive themselves under every condition, come sunshine or rain — are quite far off. In fact, the consensus among the interviewed experts was that it will take decades for SAE Level 4 vehicles (which operate in driverless conditions only within particular geo-fenced regions) to be introduced, while Level 5 vehicles (which operate everywhere and in every condition) are a long way off, or even an impossibility. To be fair, some Level 4 transport shuttles have already been introduced, but they are still accompanied by a safety steward. Moreover, some Level 5 vehicles are also available, but only in an industrial setting.

Summary table for the six levels of automation (Levels 0–5), as defined by the Society of Automotive Engineers (SAE). The graphic is © 2021 SAE International, use with acknowledgement.
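For the programmers among us, the SAE taxonomy can be sketched as a small enum. This is my own simplified illustration of SAE J3016, not part of any standard; the level names are paraphrased and the helper function is hypothetical:

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """Simplified sketch of the SAE J3016 driving-automation levels."""
    NO_AUTOMATION = 0           # human does everything
    DRIVER_ASSISTANCE = 1       # e.g. adaptive cruise control
    PARTIAL_AUTOMATION = 2      # car steers and accelerates, driver supervises
    CONDITIONAL_AUTOMATION = 3  # car drives, driver must take over on request
    HIGH_AUTOMATION = 4         # driverless, but only in a geo-fenced region
    FULL_AUTOMATION = 5         # driverless everywhere, in every condition

def needs_human_fallback(level: SAELevel) -> bool:
    # Levels 0-3 still rely on a human driver as the fallback;
    # Levels 4 and 5 do not (within their operational domain).
    return level <= SAELevel.CONDITIONAL_AUTOMATION
```

The key boundary the experts kept returning to is the jump from Level 4 to Level 5: dropping the geo-fence, not dropping the driver.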

The main challenges to achieving Level 5 in the urban environment remain assuring that the vehicles will be capable of engaging in social interaction in mixed traffic, driving on particular road types and in particular weather conditions, and anticipating the behaviour of pedestrians, cyclists, and so on. In urban environments, some of the experts worry that pedestrian-heavy areas will become more susceptible to jaywalking, owing to pedestrians' expectation that self-driving cars will always stop for them. In that case, the self-driving cars would be rendered completely immobile, or forced to drive very slowly under caution. Hence, it would be difficult to introduce these vehicles without changing the existing infrastructure, changing the behaviour of pedestrians, or segregating vulnerable road users from the self-driving cars; but it was noted that such solutions would be costly and resource-intensive.

What about smart cities and infrastructure?

The enhancement of the infrastructure would see the inclusion of smart infrastructure in urban environments. Its potential was recognised by many of the experts, despite differing interpretations of what it entails. Smart infrastructure could mean infrastructure which receives information from all the self-driving cars about their individual states and communicates with vulnerable road users wirelessly, or it may provide feedback to vulnerable road users via the road surface (e.g., asphalt which lights up in certain places), smart traffic lights, smartphones, and cloud systems.

Ok maybe too futuristic, but you get the idea. Cars driving by themselves on ‘smart’ roads. Image hallucinated by Midjourney.

I know what you’re thinking. All of this sounds very expensive. You’re right! In fact, while the experts acknowledged that the use of smart infrastructure would be beneficial to self-driving capabilities, as it would increase the vehicles’ perceptual capabilities in pedestrian-heavy areas, it was argued that there are major concerns about the cost, maintenance, and reliability of the wireless communication smart infrastructure depends on. For these reasons, several researchers were not too keen on the use of smart infrastructure, especially in lower-income countries.

Instead, the experts saw greater potential in self-driving cars which act independently of the infrastructure, and recommended further investment in that direction. One such way would be to have some sort of interface which allows the car to communicate with other road users. So, we will now look at what these interfaces are.

How would we communicate with such vehicles?

With the lack of an attentive driver, things could get rather complicated. Would you cross in front of a car whose driver is sleeping at the wheel? Probably not. What’s missing here is the explicit communication from the driver: there is no one to make eye contact with, or to give you a friendly wave of the hand to signal that you may cross.

In an attempt to solve this, both researchers and industry have been experimenting with adding attachments to self-driving cars. We call these external human-machine interfaces, or eHMIs for short. Think projections emitted from the car onto the surface of the road, LED strips around the body of the car which change colour depending on the intentions of the car, LED screens above the front number plate which display text indicating whether it is OK to cross or not, and even robotic arms attached to the top of the car which signal you to go ahead. Some have even suggested robot eyes which follow you as you cross, instead of the traditional headlamps.

Example of LED strip lights around a self-driving car. Image hallucinated by Midjourney.
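As a toy illustration, the intent-to-signal mapping behind such an LED eHMI might look like the sketch below. The intent states, colours, and messages are entirely hypothetical — as discussed next, no standard for any of this exists yet:

```python
# Hypothetical mapping from a car's current intent to an LED-strip colour
# and a short message for a front-plate display. Illustrative only.
EHMI_SIGNALS = {
    "yielding": {"led_colour": "green",  "text": "Safe to cross"},
    "cruising": {"led_colour": "blue",   "text": "Vehicle in motion"},
    "starting": {"led_colour": "orange", "text": "About to move"},
}

def render_signal(intent: str) -> str:
    signal = EHMI_SIGNALS.get(intent)
    if signal is None:
        # Unknown intent: fail safe by telling pedestrians not to rely on us.
        return "LED off | Do not cross"
    return f"LED {signal['led_colour']} | {signal['text']}"
```

Even this tiny sketch surfaces the standardisation problem: another manufacturer could just as easily pick red for yielding and green for cruising.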

Perhaps you can already infer some of the problems here. Too many approaches. What if each car manufacturer goes their own way? What if the text displayed on the car is in Japanese, but you are a French tourist visiting Tokyo? What if there is a group of pedestrians? Who would the message be for?

On solving some problems with Augmented Reality technology

One possible solution to the above problems would be to use Augmented Reality (AR) technology, which would allow pedestrians to receive individualised communication on a wearable device such as smart glasses, which some expect to replace the smartphone in the future. By individualised communication, I mean a design which you would understand, with any text presented in your preferred language. Also, if you are in a group of people, you will only see what is relevant to you.
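The individualisation idea can be sketched in a few lines. Everything here is an assumption of mine — the message catalogue, the fields, and the filtering logic stand in for a real AR system that does not exist yet:

```python
from typing import Optional

# Hypothetical catalogue of crossing messages in several languages.
MESSAGES = {
    "safe_to_cross": {
        "en": "Safe to cross",
        "fr": "Vous pouvez traverser",
        "ja": "渡れます",
    },
}

def ar_message(message_id: str, language: str,
               addressee_id: str, viewer_id: str) -> Optional[str]:
    """Return the localised message, or None if it is not meant for this viewer."""
    if addressee_id != viewer_id:
        return None  # other pedestrians in the group see nothing
    translations = MESSAGES.get(message_id, {})
    return translations.get(language, translations.get("en"))
```

The French tourist in Tokyo from the earlier example would simply see the `fr` string, while the rest of the crowd sees nothing at all.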

The experts pointed out that AR technology has been successfully used in other industries, such as manufacturing and agriculture, as well as in smartphones. Possibilities for using AR in traffic included holographic traffic lights and signs, the removal of irrelevant information from the world, an indicator nudging the user’s attention towards an automated vehicle (AV), the projection of safety zones or coloured road surfaces, or a fence or barrier indicating that one should not cross.

This person probably got the go ahead to cross, since they are in the middle of the road. Image hallucinated by Midjourney.

However, despite the ideas presented, the experts were generally critical of AR in future traffic, mentioning challenges of privacy, invasiveness, user-friendliness, technological feasibility, and inclusiveness (that is, not everybody having access to such devices). Some even raised concerns regarding its use for collision avoidance, since pedestrians would still look at the movement of the car to judge whether it is safe to cross.

Yet despite the criticism, the experts concurred that AR could resolve the one‐to‐many problem in eHMIs, or resolve language barriers by providing person-specific feedback. The experts also saw AR as a useful secondary cue to the movement of the car, especially since not everyone would be expected to wear smart glasses. One more liberal thought was that, just as automation is being developed to offload the driver and allow them to engage in non-driving activities, so too could similar support be developed for vulnerable road users, allowing them to consume media and be informed about traffic via AR.

Wrapping Up

While self-driving cars are often overhyped in terms of current capabilities, it is worth knowing that better systems are coming. When, we do not know. How they will look is also unknown. But that is the beauty of ongoing research. The next generation of vehicles, and the way they will communicate with other road users, are still in the works. What the experts tell us is that we will have to wait a bit longer to get that magic car that can do everything on its own. So unfortunately, you will have to wait longer for the day you can Netflix and ‘drive’.

In the meantime, it would be interesting to see which direction things will go: towards segregated roads, smart roads, or vehicles with some sort of interface, or communication on our smart glasses. Whatever option is picked, what is sure is that the way we interact in the future urban environment will change.

Further Reading

This article gives a brief overview of the outcomes from the conducted interviews. For more in-depth information, you are welcome to read the open access journal article which was published following the study:



Tabone, W., De Winter, J. C. F., Ackermann, C., Bärgman, J., Baumann, M., Deb, S., Emmenegger, C., Habibovic, A., Hagenzieker, M., Hancock, P. A., Happee, R., Krems, J., Lee, J. D., Martens, M., Merat, N., Norman, D. A., Sheridan, T. B., & Stanton, N. A. (2021). Vulnerable road users and the coming wave of automated vehicles: Expert perspectives. Transportation Research Interdisciplinary Perspectives, 9, 100293.

This work forms part of my PhD research, which is supported by the European Union’s Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement № 860410.



Wilbert Tabone

Human-Robot Interaction PhD candidate with a background in AI and a passion for culture and art. Working on AR for automated vehicles. #VR #AR #AI #UX #HCI