Scenario-based simulation: Combining HD maps and real-world traffic data

atlatec GmbH · Dec 9, 2020

If you work in the ADAS/Autonomous Vehicles field, you are probably familiar with HD maps — virtual recreations of real-world roads, including their 3D profile, driving rules, the interconnectivity of lanes and more.

A lot of these HD maps go into the simulation domain, where car makers and suppliers leverage them to train new ADAS/AV systems or to verify and validate such features. The reason to use HD maps of real-world roads (rather than just generic, fictional routes created from scratch) is simple: in the end, you want your system to perform in the real world, so you want to optimize for real-world conditions as early as possible, starting in simulation. As we all know, the real world is nothing if not random, and you will encounter many situations you would rarely find in generic data sets.

So far, so good: These HD maps can be used to properly train lane-keep assistance or lane-departure warning systems, validate speed limit sign detection and many other systems. However, a map only contains the static features of an environment — what about ADAS/AV features that are supposed to react to other traffic participants? Emergency braking systems, cross-traffic alerts or adaptive cruise control are all required to perform differently, depending on how other cars, bikes or pedestrians around the vehicle are acting. For proper training or testing of such systems in simulation, HD maps alone are not sufficient.

Scenario data: HD maps plus traffic data

The solution to bridge this gap is rather obvious: You add traffic to “populate” your HD maps.

Scenarios consist of a static layer (the map) and a dynamic layer (traffic).
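To make the two layers a bit more tangible, here is a minimal sketch of how a scenario could be represented in code. All class and field names below are illustrative only and are not taken from any particular simulation tool or exchange format:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

# Illustrative data model only; names and fields are not from any specific tool.

@dataclass
class Lane:
    """Static layer: one lane of the HD map with its centerline and speed limit."""
    lane_id: str
    centerline: List[Tuple[float, float, float]]  # (x, y, z) points in meters
    speed_limit_kmh: float

@dataclass
class TrajectoryPoint:
    """Dynamic layer: one timestamped pose of a traffic participant."""
    t: float           # seconds since scenario start
    x: float
    y: float
    heading: float     # radians
    speed_mps: float

@dataclass
class TrafficParticipant:
    participant_id: str
    category: str                     # e.g. "car", "bicycle", "pedestrian"
    trajectory: List[TrajectoryPoint] = field(default_factory=list)

@dataclass
class Scenario:
    """A scenario combines the static map with the traffic recorded on top of it."""
    hd_map: List[Lane]
    traffic: List[TrafficParticipant]
```

In practice, the two layers are typically stored in dedicated exchange formats rather than ad-hoc structures like this, but the split stays the same: the map changes rarely, while the traffic layer is specific to each recorded run.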

Real-world traffic, of course, is probably even more complex than real-world maps. Attempting to generically reconstruct even the simplest traffic situations, such as a number of drivers coming to a halt at an intersection, will fall short most of the time: real drivers are human beings with infinite complexity, each with their own driving style, preferences and vastly varying experience — and sometimes we have a bad day.

Once again, you want to optimize for real-world traffic situations while still in the simulation stage — so why not bring the real world into the virtual domain once more, capturing vehicles, pedestrians and more? The result is real-world scenarios: a perfectly aligned combination of HD maps and traffic, both built using data captured from survey runs on real roadways. Here’s a side-by-side comparison, taken from an atlatec scenario:

Real-world traffic recreated in simulation

As you may have noticed, not all connecting arms along the route are part of the HD map used for this scenario, so some vehicles seem to appear or disappear off-road. This is a perfect example of recreating only the features that are of interest for any given case — or of manipulating the data in ways that allow for testing of rarer cases: for example, you would want your front-view/radar system to correctly identify a vehicle pulling onto a road, even if it was pulling out “from nowhere”. This leads us right to the next topic: how do you use real-world data to simulate more extreme situations, or even accidents?

Scenario fuzzing: Manipulating real-world data for edge case identification

If you want to find out where the limitations of your system lie, you will have to simulate some scenarios that are beyond its performance limits — failures, in short. This is of course a challenge: looking at an emergency brake assist (EBA) as an example, you can hardly collect real-world data for instances where it failed — it is impractical to keep driving and hope for an accident to occur just so you can record it. This is where “scenario fuzzing” comes into play.

Using a toolchain that is optimized for scenario-based simulation, you can select certain variables of a scenario and manipulate them slightly. For example, you could raise the speed of the survey vehicle by a few km/h, or decrease the distance at which another car cuts in front of you. Keep doing this in ever so slight increments, and you will eventually end up with a fuzzed scenario where the EBA will no longer be able to prevent a crash — finding what’s commonly referred to as an edge case, or system performance limit. Combine the fuzzing for both variables (speed of the ego vehicle and cut-in distance), and you will identify which speed allows for which minimum distance and vice versa — resulting in a corner case, an instance where two edge cases meet.
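To make the idea concrete, here is a minimal sketch of such a two-variable sweep. The reaction time, deceleration and baseline values are invented numbers, and the simple stopping-distance check stands in for what would really be a full simulation run with the EBA under test:

```python
import itertools

# Minimal fuzzing sketch: sweep two scenario variables around their recorded
# values and flag combinations where an emergency brake assist (EBA) can no
# longer prevent a collision. The braking model below is a deliberately
# simple stand-in for a complete simulation run.

REACTION_TIME_S = 0.5      # assumed system reaction time
MAX_DECEL_MPS2 = 8.0       # assumed maximum deceleration of the ego vehicle

def eba_prevents_crash(ego_speed_kmh: float, cut_in_distance_m: float) -> bool:
    """Return True if the ego vehicle can stop within the cut-in distance."""
    v = ego_speed_kmh / 3.6                                   # km/h -> m/s
    stopping_distance = v * REACTION_TIME_S + v**2 / (2 * MAX_DECEL_MPS2)
    return stopping_distance < cut_in_distance_m

# Recorded (baseline) values from the real-world scenario; illustrative numbers.
base_speed_kmh = 50.0
base_cut_in_m = 35.0

# Fuzz both variables in small increments around the recorded values.
speeds = [base_speed_kmh + 2.0 * i for i in range(10)]        # +0 .. +18 km/h
distances = [base_cut_in_m - 2.0 * i for i in range(10)]      # -0 .. -18 m

for speed, distance in itertools.product(speeds, distances):
    if not eba_prevents_crash(speed, distance):
        # A failing combination marks an edge case; the set of boundary points
        # over both variables outlines the corner case described above.
        print(f"edge case: {speed:.0f} km/h with cut-in at {distance:.0f} m")
```

The boundary between passing and failing combinations in such a sweep is exactly the performance limit described above; in a real toolchain, each combination would trigger a complete scenario replay rather than a closed-form check.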

Here is a different example, showing a white vehicle pulling out to reveal a stationary car in the lane ahead — in reality, on the left, it pulls out with ample time left for an emergency braking maneuver. In the manipulated version, on the right, it pulls out later, leaving less time for the system to identify and react to the hazard:

Reproduced real-world scenario (left) and a slight variation, after scenario fuzzing (right)

Leveraging scenario fuzzing of recorded data allows you to reap both benefits: enhancing simulation realism and relevance by using real-world data, and identifying edge and corner cases by incrementally manipulating scenario variables.

To explore this topic in more depth, and to see some video examples (including the one from which the above screenshot was taken), we recommend the presentation “Edge Case Hunting in Scenario Based Virtual Validation of AVs” from this year’s “Apply & Innovate” conference hosted by IPG Automotive:

“Edge Case Hunting in Scenario Based Virtual Validation of AVs” by atlatec CEO Dr. Henning Lategahn

Recording scenarios during Field Operational Testing

One opportunity to encounter a multitude of relevant scenarios or even edge cases in the real world is the Field Operational Testing (FOT) phase: this is when OEMs, Tier 1s or their partners conduct test drives on open roads, over thousands and thousands of kilometers. These test drives take place when a system is considered safe enough for testing in public, as a prerequisite to final approval by regulators.

Of course, it is not uncommon to spot ADAS performance issues in the FOT phase — that’s what it’s for, after all. Typically, these will be issues that occur very rarely, either in very specific situations (such as near-edge cases) or only after a certain time of operation — this is, after all, the first time a system is being tested at scale in reality.

When such a situation occurs, it is a treasure trove of information for validation and verification engineers: all the onboard data recorded during these drives is analyzed in as much detail as possible, attempting to identify the cause and to fix errors, if there were any. However, onboard data will only tell you what a system “thought” happened: if you are hunting for false negatives or false positives (e.g. in sensor data), you need to match this data against what really happened.
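As a toy illustration of that matching step, the sketch below compares onboard object detections against the traffic participants recorded in the scenario at one timestamp. The data layout and the 2 m association radius are assumptions made purely for this example:

```python
import math
from typing import List, Tuple

# Toy matching of onboard detections against scenario ground truth at one
# timestamp. Positions are (x, y) in a shared coordinate frame; the 2 m
# association radius is an arbitrary illustrative choice.

MATCH_RADIUS_M = 2.0

def match_detections(detections: List[Tuple[float, float]],
                     ground_truth: List[Tuple[float, float]]):
    unmatched_truth = list(ground_truth)
    false_positives = []
    for det in detections:
        # Find the closest still-unmatched ground-truth object.
        best = min(unmatched_truth,
                   key=lambda gt: math.dist(det, gt),
                   default=None)
        if best is not None and math.dist(det, best) <= MATCH_RADIUS_M:
            unmatched_truth.remove(best)      # true positive
        else:
            false_positives.append(det)       # system "saw" something that was not there
    false_negatives = unmatched_truth         # real objects the system missed
    return false_positives, false_negatives

# Example: two real vehicles, one missed by the system plus one phantom detection.
fp, fn = match_detections(detections=[(10.0, 2.1), (40.0, -5.0)],
                          ground_truth=[(10.0, 2.0), (25.0, 0.0)])
print("false positives:", fp)   # [(40.0, -5.0)]
print("false negatives:", fn)   # [(25.0, 0.0)]
```

Anything the system reported but the scenario does not contain is a false positive; anything in the scenario the system never reported is a false negative.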

To this end, you can leverage scenario recording during FOT: when test vehicles are equipped to record HD map and traffic data, you can recreate the exact situation in which a system failure or near-failure occurred — including the precise road layout, the vehicle’s precise position and pose as well as the traffic around it at that time.

Sensor hardware for scenario production mounted on a vehicle

Replaying these “micro scenarios” in simulation allows for much more comprehensive insights into the situation surrounding a performance issue identified during FOT. Additionally, by fuzzing the data you can play around with infinite “what if” questions, further drilling down into the precise cause and severity of any errors.

If you have any questions or would like to discuss how to leverage scenarios for your work, don’t hesitate to reach out via email or request a meeting with us.

Author: Tom Dahlström, atlatec GmbH
