Gathering data for autonomous driving in adverse weather conditions

Anyverse™
Nov 10, 2022

Autonomous driving is fundamentally changing the way people and goods are transported, and it could benefit society in significant ways: fewer traffic accidents and fatalities, lower energy consumption and air pollution, and increased access to transportation for people with limited mobility options. However, accidents involving autonomous vehicles are still rising.

We can all agree that the self-driving market needs to reach a level of development where a driverless vehicle is 100% safe (and perceived as such by society) and reliable enough to operate like any human-driven vehicle in the vastness of the real world. To achieve this, one of the biggest challenges must be properly addressed: the performance of autonomous driving in adverse weather. How do you gather the data to get there?

Negative influences of adverse weather on traffic and transportation

The numbers speak for themselves. The risk of a crash in rain is 70% higher than in normal conditions [1]. Taking US national statistics as an example, each year over 30,000 vehicle crashes occur on snowy or icy roads or during snowfall or sleet, so the threat from snow is quite real. Fog, haze, sandstorms, and strong light severely reduce visibility, and the difficulties they cause for driving are palpable. To all these problems we have to add others caused directly or indirectly by weather, such as heat, cold, contamination, and damage to vehicle hardware. These have unpredictable or undesirable effects on both traditional and autonomous transportation.

Rapid AV development, but reliability issues remain in bad weather

Currently, many autonomous driving projects are either in the trial stage or in operation all over the world. Some manufacturers claim to have achieved, or to be about to deliver, vehicles with autonomy equivalent to Level 4 of the SAE standard [2], such as Waymo's commercial self-driving taxi service in Phoenix, Arizona [3].

However, despite the gigantic advances of recent years, there is a common issue for all current driverless vehicles: they barely operate during heavy rain or snow because of safety concerns.

The reality is that adverse weather still keeps human drivers at the steering wheel, and autonomous vehicles cannot yet be fully trusted to work alone. Therefore, for autonomous driving systems to continue to evolve in a way that is trustworthy and safe for humans, the problem of reliability in adverse weather conditions must be addressed.

Gathering synthetic data to accurately simulate adverse weather conditions

Gathering the right data is key. Autonomous driving research and development cannot be done without data: many features used in object detection need to be extracted from datasets, and almost every algorithm needs to be tested and validated on datasets. Simulating adverse weather conditions is no exception, so having enough data covering each kind of weather is essential.

Unfortunately, most of the datasets commonly used for development contain few conditions other than clear weather. Synthetically generated datasets, on the other hand, open a new set of simulation options that can fill this gap.
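One quick way to see how skewed a real-world dataset is toward clear weather is to audit the weather labels in its metadata. Below is a minimal sketch assuming a hypothetical per-frame metadata file with a `weather` field; the file name and schema are illustrative, not taken from any particular dataset.

```python
import json
from collections import Counter

def weather_distribution(metadata_path: str) -> Counter:
    """Count frames per weather label in a dataset's metadata.

    Assumes a JSON list of per-frame records, each carrying a
    "weather" field (e.g. "clear", "rain", "fog", "snow"). This
    schema is hypothetical; adapt it to the dataset you use.
    """
    with open(metadata_path) as f:
        frames = json.load(f)
    return Counter(frame.get("weather", "unknown") for frame in frames)

if __name__ == "__main__":
    counts = weather_distribution("frames_metadata.json")
    total = sum(counts.values())
    for label, n in counts.most_common():
        print(f"{label:>8}: {n:6d} frames ({100 * n / total:.1f}%)")
```

Running an audit like this on popular driving datasets typically shows clear-weather frames dominating by a wide margin, which is exactly the imbalance synthetic data can correct.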

With its hyperspectral synthetic data platform, Anyverse is already tackling this data gap. Its data solution for autonomous driving AI development lets developers simulate any weather condition and apply it to the specific scenes they use for training, validation, and testing.

Adverse weather influence on camera sensors

Several types of sensors are used in autonomous vehicles, but the camera is still the one element that is absolutely irreplaceable, and it is also one of the most vulnerable to adverse weather conditions.

To better illustrate the influence of some adverse weather conditions on cameras and other sensors, take a look at the comparison in Table 1 [4].

Table 1 — The influence level of various weather conditions on sensors [4] (table not reproduced here)

How can Anyverse help understand the world under extreme conditions?

Conscious of these facts, Anyverse has taken two approaches to help develop camera-based perception systems for adverse weather conditions.

First, develop a faithful simulation of different weather conditions: rain, fog, snow, and extreme light. All of these can reach levels that completely “blind” the camera sensor [4]. Until you reach that point of no return, however, you need to train your system to deal with those conditions, and that is where the Anyverse Platform's weather simulation can help. We have partnered with the National University of Ireland to validate a new rain and fog model, and we are now working to extrapolate those results to our snow model. Our physically correct hyperspectral light simulation and material characterization allow us to simulate the effects of light on different materials, generate faithful reflections, and handle direct light on the camera.
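As a rough intuition for what a physically based fog model has to capture, the classic Koschmieder atmospheric scattering model attenuates scene radiance with distance and blends in airlight. The sketch below applies it to an RGB image given a per-pixel depth map. It is a simplified single-scattering approximation with illustrative parameter values, not Anyverse's validated model.

```python
import numpy as np

def apply_fog(image: np.ndarray, depth_m: np.ndarray,
              beta: float = 0.05, airlight: float = 0.8) -> np.ndarray:
    """Apply Koschmieder-style homogeneous fog to an image.

    image:    HxWx3 float array in [0, 1] (scene radiance J)
    depth_m:  HxW per-pixel distance to the camera in meters
    beta:     extinction coefficient (1/m); larger = denser fog
    airlight: atmospheric light A, assumed grey and constant

    I = J * t + A * (1 - t), with transmission t = exp(-beta * d).
    """
    t = np.exp(-beta * depth_m)[..., None]   # HxWx1 transmission map
    return image * t + airlight * (1.0 - t)

# Illustrative usage with synthetic inputs
rng = np.random.default_rng(0)
img = rng.random((4, 4, 3))                  # fake radiance values
depth = np.full((4, 4), 50.0)                # everything at 50 m
foggy = apply_fog(img, depth, beta=0.03)
print(foggy.min(), foggy.max())
```

Note how, as `beta * depth` grows, every pixel converges to the airlight value: that is the “point of no return” where the camera is effectively blind and no amount of training can recover the scene.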

Second, we have developed Anyverse's camera sensor simulation pipeline, capable of simulating any camera sensor. This means that, using the hyperspectral output of our renderer, you can simulate very faithfully what the sensor will perceive of the real world.
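In broad strokes, a sensor simulation of this kind integrates the hyperspectral radiance at each pixel against the sensor's spectral sensitivity curves and then adds sensor effects such as shot noise. The sketch below shows that core integration step under simplifying assumptions (ideal optics, made-up sensitivity curves); it illustrates the general technique, not Anyverse's actual pipeline.

```python
import numpy as np

def hyperspectral_to_sensor(radiance: np.ndarray,
                            sensitivities: np.ndarray,
                            wavelength_step_nm: float) -> np.ndarray:
    """Project hyperspectral radiance onto sensor color channels.

    radiance:      HxWxB hyperspectral radiance over B spectral bands
    sensitivities: CxB spectral sensitivity per channel (C=3 for RGB)
    Returns an HxWxC raw sensor response (arbitrary units).
    """
    # Riemann-sum approximation of the integral over wavelength
    return np.einsum("hwb,cb->hwc", radiance, sensitivities) * wavelength_step_nm

# Illustrative usage: 31 bands spanning 400-700 nm, fake curves
rng = np.random.default_rng(1)
cube = rng.random((4, 4, 31))        # fake hyperspectral render
sens = rng.random((3, 31))           # fake RGB sensitivity curves
raw = hyperspectral_to_sensor(cube, sens, wavelength_step_nm=10.0)

# A simple shot-noise model on the raw response (illustrative gain)
gain = 1000.0
noisy = rng.poisson(raw * gain) / gain
print(raw.shape, noisy.shape)
```

Because the integration happens per sensor, swapping in a different set of sensitivity curves is enough to compare how two candidate cameras would perceive the same rendered scene.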

Putting these two approaches together, you can create synthetic data that takes your perception system to the limit and gets it ready to understand the world under extreme weather, before the camera reaches its physical limit of blindness. Beyond that point, perception systems based on other sensors are critical to the overall goal of fully autonomous driving.

References

[1] Jean Andrey and Sam Yagar. “A temporal analysis of rain-related crash risk.” Accident Analysis & Prevention 25.4 (1993), pp. 465–472.

[2] SAE International. “Levels of driving automation are defined in new SAE International standard J3016.” Warrendale, PA, USA (2014).

[3] Michael Laris. “Waymo launches nation's first commercial self-driving taxi service in Arizona.” Washington Post (December 2018).

[4] Yuxiao Zhang, Alexander Carballo, Hanting Yang, and Kazuya Takeda. “Autonomous Driving in Adverse Weather Conditions: A Survey” (2021), p. 7. arXiv:2112.08936.

About Anyverse™

Anyverse™ helps you continuously improve your deep learning perception models to reduce your system's time to market by applying new Software 2.0 processes. Our synthetic data production platform provides high-fidelity, accurate, and balanced datasets. Combined with a data-driven iterative process, we can help you reach the required model performance.

With Anyverse™, you can accurately simulate any camera sensor and decide which one will perform best with your perception system. No more complex and expensive experiments with real devices, thanks to our state-of-the-art photometric pipeline.

Need to know more?

Visit our website, anyverse.ai, anytime, or our LinkedIn, Instagram, and Twitter profiles.
