Validation: how accurate is the Anyverse sensor simulation pipeline?
In the previous chapters of this Anyverse sensor simulation insights series, we presented and delved into each stage of the pipeline. At this point you may ask: how accurate, or how close to real sensors, is the Anyverse sensor simulation pipeline described here?
Fair enough. In this chapter, we describe the validation method we ran to answer that question.
Validation
We have validated the Anyverse™ sensor simulation pipeline against the ISET Toolbox for Matlab from Stanford University. This toolbox simulates a camera sensor and has itself been validated by comparing simulated data with real data [1][2].
We used the ISET toolbox to obtain the colors of the Macbeth color checker [6] with a specific sensor configuration, and then used the same sensor configuration to obtain the colors with Anyverse™'s sensor simulation. Below are the images obtained from the two simulators.
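For readers who want to reproduce this kind of comparison, here is a minimal sketch (not Anyverse's actual tooling) of how per-patch colors could be extracted from the two rendered images and converted to CIELAB. It assumes each image is an 8-bit sRGB render tightly cropped to the 4x6 chart; the file names and the `patch_lab_values` helper are hypothetical.

```python
import numpy as np
from skimage import io
from skimage.color import rgb2lab

def patch_lab_values(image_path, rows=4, cols=6, margin=0.25):
    """Return the mean CIELAB color of each patch in a Macbeth chart image.

    Assumes an 8-bit sRGB image tightly cropped to the rows x cols chart;
    `margin` trims the border of each cell to avoid sampling patch edges.
    """
    img = io.imread(image_path)[..., :3] / 255.0  # drop alpha, scale to [0, 1]
    lab = rgb2lab(img)                            # assumes sRGB encoding
    h, w = lab.shape[:2]
    ph, pw = h / rows, w / cols
    values = []
    for r in range(rows):
        for c in range(cols):
            y0, y1 = int((r + margin) * ph), int((r + 1 - margin) * ph)
            x0, x1 = int((c + margin) * pw), int((c + 1 - margin) * pw)
            values.append(lab[y0:y1, x0:x1].reshape(-1, 3).mean(axis=0))
    return np.array(values)  # shape (24, 3): L*, a*, b* per patch

# Hypothetical file names for the two simulators' renders:
iset_lab = patch_lab_values("iset_macbeth.png")
anyverse_lab = patch_lab_values("anyverse_macbeth.png")
```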
In the table below we show the colors of the Macbeth chart's 24 patches from the two simulators, expressed in the CIELAB color space. Notice how closely they agree.
In the chart below we show a scatter plot of the color values from the two simulators. The trend shows that the results are very similar.
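A minimal sketch of the kind of agreement metrics behind such a table and scatter plot: the per-patch CIE76 color difference (Delta E*ab, the Euclidean distance in CIELAB) and the correlation between the two simulators' channel values. The `iset_lab` and `anyverse_lab` arrays come from the sketch above; the source does not state which metrics Anyverse reported, so these are illustrative choices.

```python
import numpy as np

def delta_e76(lab_a, lab_b):
    """CIE76 color difference: Euclidean distance between CIELAB triplets."""
    return np.linalg.norm(lab_a - lab_b, axis=-1)

# Per-patch difference between the two simulators' (24, 3) CIELAB arrays:
de = delta_e76(iset_lab, anyverse_lab)
print(f"Delta E*ab per patch: mean={de.mean():.2f}, max={de.max():.2f}")

# One number for the scatter plot's trend: correlation across all
# L*, a*, b* values from the two simulators.
r = np.corrcoef(iset_lab.ravel(), anyverse_lab.ravel())[0, 1]
print(f"Pearson correlation between simulators: r={r:.4f}")
```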
Don’t miss the final chapter
If you have followed the previous chapters, you already have a notion of how important correct sensor simulation is for training and testing deep learning-based perception systems with synthetic data.
Without sensor simulation, your synthetic data may not be that useful for deploying your autonomous system: it may be harder for neural networks to generalize to real-world images. Because at the end of the day, that is every perception system's goal: to understand and interpret the real world.
Don't miss the final chapter of this insight series next week, where we wrap up everything we have learned together over the last weeks.
References
[1] https://en.wikipedia.org/wiki/Moir%C3%A9_pattern
[2] Farrell, J. E., Wandell, B. A. (2014). Image Systems Simulation. In Handbook of Digital Imaging, 373–401.
About Anyverse™
Anyverse™ helps you continuously improve your deep learning perception models and reduce your system's time to market by applying new Software 2.0 processes. Our synthetic data production platform provides high-fidelity, accurate, and balanced datasets. Combined with a data-driven iterative process, it helps you reach the required model performance.
With Anyverse™, you can accurately simulate any camera sensor and decide which one will perform best with your perception system. No more complex and expensive experiments with real devices, thanks to our state-of-the-art photometric pipeline.
Need to know more?
Visit our website, anyverse.ai, anytime, or our LinkedIn, Instagram, and Twitter profiles.