AVs in a sim city: What simulation testing is and what it isn’t

Svetlana Kudriashova
Evocargo
Jan 15, 2024 · 6 min read


In popular culture, simulation is associated with gaming, sci-fi, and the big question 'Are we all living in The Matrix?' (not sure I want to know the answer). In a simulation, we can take a risky path, challenge our reactions and resilience, and hit and crash without doing any harm in reality. That's fun. But such imaginary worlds can also serve a bigger purpose: we can insert digital versions of various objects and mechanisms into a simulation and see how they behave in stressful situations. This is, in essence, how simulation is used for testing autonomous vehicles.

At Evocargo, we design and build electric autonomous vehicles (AVs) with Level 4 autonomy per SAE standard J3016_202104, and we run a logistics service that moves cargo around industrial yards and large logistics hubs. Our experience with commercial in-hub operation shows that AVs hold enormous potential to transform cargo logistics and boost its efficiency. But broad adoption of autonomous technologies, on public roads for instance, will only be possible once they are proven safe. AVs would need to drive hundreds of millions of miles in real urban areas to produce statistics that conclusively show the risks of their use are minimal. (To learn more about mileage stats, see the article Driving to Safety: How Many Miles of Driving Would It Take to Demonstrate Autonomous Vehicle Reliability?) Simulation testing partially solves this problem by letting developers run many tests at once and drive millions of miles virtually.

Let’s see exactly what a simulator can and cannot do, the place of simulation tests in the Evocargo pipeline, and how such tests make life easier for developers and QA engineers.

What a simulator is

A simulator is a platform that lets us virtually recreate specific environments and situations and test various driving scenarios. The most popular simulators are NVIDIA DRIVE Sim, DeepDrive, CARLA, and a few others (see a longer list here). Our engineers chose CARLA, an open-source simulator built on top of Unreal Engine 4, to test the autonomous driving algorithms used in Evocargo vehicles.

A simulated 3D world contains various digital assets such as buildings, lane types, vehicles, and more

In CARLA, we simulate the sensors installed on our vehicles: lidars, cameras, IMU, GNSS, and encoders. Based on this set of sensors, we run basic tests on autopilot features related to:

  • Interfaces: autopilot API, interaction between autopilot systems
  • Navigation: localization, route planning, control systems
  • Perception: obstacle detection, drivable area segmentation
  • Safety and diagnostics

For example, we check whether our vehicle can detect an obstacle in its way and create an alternative route to drive around it without stopping. Or whether our vehicle reverses up to a loading gate with sufficient accuracy to ensure smooth loading.
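As an illustration of the first check, the core decision can be sketched in a few lines of Python. This is a hypothetical simplification, not Evocargo's actual test code: it flags waypoints of the planned path that fall within a clearance radius of a detected obstacle and prefers replanning a detour over stopping.

```python
import math

def path_is_blocked(path, obstacles, clearance=1.5):
    """Return True if any waypoint lies within `clearance` meters of an obstacle."""
    for wx, wy in path:
        for ox, oy in obstacles:
            if math.hypot(wx - ox, wy - oy) < clearance:
                return True
    return False

def choose_action(path, obstacles):
    """Prefer replanning a detour over an emergency stop when the path is blocked."""
    return "replan_detour" if path_is_blocked(path, obstacles) else "follow_path"
```

A real test would feed this kind of logic with obstacle positions reported by the simulated perception stack rather than hard-coded coordinates.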

We can also virtually recreate specific driving conditions to run the same scenario in day and night lighting, on a straight route and through tangled road junctions, in sunny and rainy weather, with more or fewer obstacles, and so on. We tune multiple parameters to get a particular combination of factors and save it as a weather configuration preset.

Tough weather preset used for testing Evocargo AVs

Our worst-case weather scenario simulates

  • high fog density with low (5-meter) visibility,
  • heavy rain and lots of puddles on the road,
  • mid-level cloudiness and wind,
  • and low lighting to make things even worse.

We can also script dynamic changes to the weather.
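A preset like the worst-case scenario above can be expressed as a plain parameter set. In the self-contained sketch below, the field names mirror CARLA's carla.WeatherParameters (cloudiness, precipitation, fog_density, and so on are real CARLA fields), but the specific values are illustrative, not our production preset. Dynamic weather changes can then be scripted by interpolating between presets.

```python
from dataclasses import dataclass, fields

@dataclass
class WeatherPreset:
    # Field names mirror carla.WeatherParameters; values are 0-100 except fog_distance (meters).
    cloudiness: float = 50.0               # mid-level cloud cover
    precipitation: float = 90.0            # heavy rain
    precipitation_deposits: float = 80.0   # lots of puddles on the road
    wind_intensity: float = 50.0           # mid-level wind
    fog_density: float = 95.0              # dense fog...
    fog_distance: float = 5.0              # ...with 5-meter visibility
    sun_altitude_angle: float = 5.0        # low sun, low lighting

def blend(a: WeatherPreset, b: WeatherPreset, t: float) -> WeatherPreset:
    """Linearly interpolate two presets; stepping t from 0 to 1 scripts a gradual change."""
    return WeatherPreset(**{
        f.name: getattr(a, f.name) + t * (getattr(b, f.name) - getattr(a, f.name))
        for f in fields(WeatherPreset)
    })

clear = WeatherPreset(cloudiness=10, precipitation=0, precipitation_deposits=0,
                      wind_intensity=10, fog_density=0, fog_distance=1000,
                      sun_altitude_angle=70)
storm = WeatherPreset()  # the worst-case preset described above
halfway = blend(clear, storm, 0.5)
```

In an actual CARLA run, the resulting values would be copied into a carla.WeatherParameters object and applied with world.set_weather().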

Note. Although CARLA offers good configuration capabilities, not every sensor can be added out of the box. For example, there is no built-in support for encoders. To simulate them, we follow these steps:

  • Get info about the wheel size from the model in Unreal Engine
  • Get the vehicle’s speed from Unreal Engine
  • Convert the speed to ticks
  • Apply noise model to measurement
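The steps above can be sketched in Python. The wheel radius, ticks-per-revolution count, and Gaussian noise level below are illustrative parameters, not Evocargo's actual values:

```python
import math
import random

def simulate_encoder_ticks(speed_mps: float, wheel_radius_m: float,
                           ticks_per_rev: int, dt_s: float,
                           noise_std_ticks: float = 0.5) -> int:
    """Convert vehicle speed over one time step into a noisy encoder tick count."""
    # Steps 1-2: wheel size and vehicle speed come from the Unreal Engine model.
    wheel_circumference_m = 2.0 * math.pi * wheel_radius_m
    revolutions = (speed_mps * dt_s) / wheel_circumference_m
    # Step 3: convert revolutions to encoder ticks.
    ticks = revolutions * ticks_per_rev
    # Step 4: apply a simple Gaussian noise model, then quantize to whole ticks.
    ticks += random.gauss(0.0, noise_std_ticks)
    return round(ticks)
```

For example, with a wheel radius chosen so the circumference is 1 m, driving at 1 m/s for 1 s with a 100-tick encoder and noise disabled yields 100 ticks.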

What a simulator is not

At Evocargo, we believe that simulation is not a substitute for testing at a real test site, and simulated data is not a substitute for a real-world dataset. It is, however, a superb complement to both. It not only lets us accumulate virtual mileage, but also spares our engineers wasted time and effort by letting them discard updates that would be bound to fail at the real test site.

Notably, CARLA allows us to test almost all autopilot subsystems, except those deeply tied to hardware (e.g. drivers). It cannot replace specialized tests for individual subsystems (e.g. real data for perception), but it is a solid way to check scenarios that require interaction between autopilot software modules.

Driving in simulation

The video shows the vehicle traveling a short section of a standard route. On this section, we check basic autopilot functionality: how the vehicle drives autonomously to the specified destination, and how well it drives around an obstacle, makes turns, and reverse parks.

Simulation as part of our safety-assurance pipeline

Simulation testing is one of the many steps we take to ensure the safety and reliability of the Evocargo service. It is seamlessly integrated into our autopilot testing pipeline, so the tests run automatically as part of CI.

Autopilot testing pipeline

When a test fails, an engineer can view the list of all launched tests and a detailed report on each of them.
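Conceptually, this CI step boils down to running every scenario and aggregating the results into a per-test report. The sketch below is a hypothetical illustration of that aggregation, with plain Python callables standing in for real simulator scenarios:

```python
from typing import Callable, Dict

def run_suite(scenarios: Dict[str, Callable[[], bool]]) -> dict:
    """Run each simulation scenario and build the report an engineer reviews on failure."""
    results = {name: check() for name, check in scenarios.items()}
    failed = [name for name, passed in results.items() if not passed]
    return {"total": len(results), "failed": failed, "details": results}
```

A real runner would launch each scenario in the simulator and collect artifacts (logs, recordings) alongside the pass/fail verdict.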

We’ve also made the simulation testing infrastructure easily accessible. If developers want to check how their changes affect autopilot features before merging, they can do so manually at any time; for this, we have set up remote desktop access for launching tests by hand.

The infrastructure is also scalable: dozens of tests can run simultaneously, and more test units can be added when needed.

Simulator architecture

As mentioned above, we simulate five sensors in our tests: lidars, cameras, IMU, GNSS, and encoders. Here is an example of the simulator architecture for two of them (lidars and GNSS).

Simulator architecture for lidars and GNSS

The entities simulating the sensors are implemented in Unreal Engine 4: the LidarActor imitates laser emission and returns the distances to surrounding objects, while the GPSActor provides the vehicle’s global location on the map. The CARLA framework runs on top of the engine and is used for experiments with autonomous systems.
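This division of labor can be sketched as a small class hierarchy. The names LidarActor and GPSActor come from our architecture; the Python interface below is a hypothetical simplification (the real actors are implemented in Unreal Engine 4 and exposed through CARLA):

```python
from abc import ABC, abstractmethod

class SensorActor(ABC):
    """Base class for in-engine sensor actors attached to the simulated vehicle."""
    @abstractmethod
    def measure(self, world_state: dict) -> dict: ...

class LidarActor(SensorActor):
    def measure(self, world_state: dict) -> dict:
        # Imitates laser emission: returns distances to objects around the vehicle.
        return {"distances_m": world_state.get("object_distances", [])}

class GPSActor(SensorActor):
    def measure(self, world_state: dict) -> dict:
        # Provides the vehicle's global location on the map.
        return {"lat": world_state["lat"], "lon": world_state["lon"]}
```

Each actor reads the shared world state maintained by the engine and emits only its own sensor's view of it, which is what the autopilot stack consumes during a test.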

Challenges to overcome

Even though simulators are a crucial tool for self-driving development, several open issues need to be resolved to further increase their usefulness:

  1. More accurate simulation of traffic participants. The majority of simulators provide an API to program road traffic agents (e.g. see SUMO integrated with CARLA). However, it is still tedious to create multiple agents. Moreover, they follow simple rule-based behavioral models, while the behavior of a self-driving car should influence the behavior of the agents. More intelligent logic for such agents, for instance neural-network-based logic, needs to be developed. This problem is described in detail in this video: https://www.youtube.com/watch?v=S59lIhwU4dA
  2. Adverse weather simulation. Gathering data for perception algorithms is where simulators could come in handy. However, many of them simulate adverse weather conditions poorly. Rain and snow often only affect image data (see CARLA 0.9.14 Release) and do not create noise in other sensors’ data, such as lidar point clouds.
    Generative AI tools (like GAIA-1) can produce varied driving scenes, but they have not yet evolved enough to be integrated into simulators and solve this problem, though they look very promising.
  3. Map reconstruction. One of the challenges in self-driving is deployment in new regions. Testing the autopilot in new surroundings takes time. Ideally, we would take a realistic map of the region, reconstruct the target environment, and run extensive tests in the simulator. Unfortunately, precise 3D map reconstruction is currently infeasible. However, certain research directions look promising, for instance advances in 3D computer vision and the NKSR architecture (Neural Kernel Surface Reconstruction, nvidia.com), which recovers a 3D surface from an input point cloud.

Want to dive in and learn more about unresolved issues in simulators? See the article Choose Your Simulator Wisely: A Review on Open-source Simulators for Autonomous Driving and the AutoSens conference review Addressing 5 key challenges for autonomous vehicle simulation.

Developer Relations Manager at Evocargo, editor-in-chief of the Evocargo tech blog