EpiK Protocol to Make Tesla AI Driving 10 Times More Efficient

What the human eye sees is not the real world.

Human vision perceives only light in the extremely narrow wavelength range of 400 nm to 700 nm. Faced with a real world that contains far more colors than we can perceive and more information than the brain can process, the eyes and brain must cooperate to reduce the amount of signal received, so that we gain the ability to “focus”.

The current traffic system is likewise designed around human visual perception and cognition.

To replace human driving with artificial intelligence, we need to start by simulating human perception. This time Andy, a domain expert in smart transportation, takes us into EpiK Protocol’s “private tutoring” class on autonomous driving for Tesla. We’ll look at what the team did to “fool” Tesla’s pure-vision autonomous driving system, getting it to perform well inside the simulated traffic system of a digital twin city.

Cars driving in a digital twin world

A powerful sensor suite plus a neural-network computing “brain” is the key to ensuring that an autonomous vehicle correctly senses its surroundings and makes safe, reasonable decisions. This “two-wheel drive” of algorithms and data is critical to maturing autonomous driving technology.

As autonomous vehicles develop and commercial deployments take shape, the algorithm gap between competitors is gradually narrowing. Data has become the factor that really determines the level and applicability of autonomous driving technology.

Collecting and labeling the datasets needed to train autonomous driving algorithms on real-world roads is bounded by wall-clock time: a vehicle can log at most 24 hours of driving per day. On EpiK Protocol’s AI driving simulation tool platform, throughput can reach 11,000 hours per day.
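As a back-of-the-envelope illustration of those figures (the per-vehicle framing is an assumption; the article does not say how many real vehicles the 24-hour figure covers):

```python
# Rough throughput comparison using the figures quoted above; the
# per-vehicle framing is an assumption, not a claim from the platform.
REAL_HOURS_PER_DAY = 24        # one vehicle cannot log more than 24 h of road data a day
SIM_HOURS_PER_DAY = 11_000     # simulated driving hours generated per day

speedup = SIM_HOURS_PER_DAY / REAL_HOURS_PER_DAY
print(f"Simulation throughput: ~{speedup:.0f}x a single real vehicle")
# -> Simulation throughput: ~458x a single real vehicle
```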

Domain expert Andy and his team have moved the dataset labeling required for autonomous driving algorithm training into the virtual digital environment of the simulation tool platform, building a 1:1 replica of the real world for autonomous driving algorithms.

For autonomous driving algorithm training, the biggest challenge of labeling datasets in a virtual digital environment is achieving “physical realism”. If the real world is not faithfully simulated, any dataset built on top of it is a “castle in the air”: training or testing with inaccurate data only produces inaccurate results.

Therefore, to ensure the authenticity of the annotated dataset, every module used in the simulation tool is derived from the real world: the acquisition and construction of static scenes and roads, the modeling of real vehicle driving behavior and human behavior, the extraction and generalization of extreme road conditions, and physically realistic vehicle dynamics and sensors.

Andy and his team have adopted state-of-the-art digital twin technology. On the one hand, accurate labeled datasets give people a more realistic and immersive experience in the virtual digital environment, the “metaverse”, created with digital twin technology. On the other hand, a virtual digital environment generated from annotated datasets can “rehearse” any real-world behavior or event, which greatly reduces the trial-and-error costs of technology testing, business development, and government decision-making.

Sensor simulation models, built on highly accurate and diverse static and dynamic scenarios, are driven entirely by real-world data, using a physical-level sensor simulation approach based on real sensor calibrations. The simulation then outputs an annotated dataset that comes as close as possible to the real world.
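To make “physical-level camera simulation based on real sensor calibration” concrete, here is a minimal sketch of its simplest ingredient: projecting simulated 3D scene points through a camera’s measured pinhole intrinsics. The intrinsic values below are hypothetical placeholders, not EpiK Protocol’s actual calibration data:

```python
import numpy as np

# Hypothetical pinhole intrinsics (focal lengths fx, fy and principal
# point cx, cy); in practice these come from calibrating a real camera.
K = np.array([[1000.0,    0.0, 960.0],
              [   0.0, 1000.0, 540.0],
              [   0.0,    0.0,   1.0]])

def project(points_cam: np.ndarray) -> np.ndarray:
    """Project Nx3 points (camera frame, metres, z > 0) to Nx2 pixel coordinates."""
    z = points_cam[:, 2:3]                  # per-point depth
    uv = (K @ (points_cam / z).T).T         # normalise, then apply intrinsics
    return uv[:, :2]

# A simulated obstacle corner 20 m ahead, 2 m left, 1 m below the lens axis:
print(project(np.array([[-2.0, 1.0, 20.0]])))   # -> [[860. 590.]]
```

A full physical-level model would add lens distortion, exposure, and noise calibrated from the real sensor; this sketch covers only the geometric core.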

To address the core concerns of self-driving perception algorithm training and testing, EpiK Protocol has built the corresponding functions into the AI driving simulation tool platform’s multi-sensor virtual annotation dataset. These include, but are not limited to (a sketch of how one such frame might be organized follows the list):

- physically realistic lighting and a parametric weather system;
- physical-level camera and LiDAR simulation based on real sensor calibration;
- 2D/3D bounding-box ground-truth output from multiple viewpoints (vehicle-side, roadside, and aerial);
- pixel-level ground-truth output such as depth maps, semantic segmentation maps, instance segmentation maps, and optical flow;
- perfectly synchronized multi-sensor datasets.
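Purely as an illustration, a single synchronized, fully annotated frame could be organized like the hypothetical Python structure below; the field names and array shapes are assumptions, not EpiK Protocol’s actual schema:

```python
from dataclasses import dataclass
import numpy as np

# Hypothetical container for one synchronized, fully annotated simulator
# frame; names and array shapes are illustrative assumptions only.
@dataclass
class AnnotatedFrame:
    timestamp_ns: int          # shared clock so all sensors stay in sync
    viewpoint: str             # "vehicle", "roadside", or "aerial"
    rgb: np.ndarray            # HxWx3 camera image
    lidar_points: np.ndarray   # Nx4 points (x, y, z, intensity)
    boxes_2d: np.ndarray       # Mx4 pixel boxes (x1, y1, x2, y2)
    boxes_3d: np.ndarray       # Mx7 boxes (x, y, z, l, w, h, yaw)
    depth: np.ndarray          # HxW per-pixel depth ground truth
    semantic: np.ndarray       # HxW class-id map
    instance: np.ndarray       # HxW instance-id map
    optical_flow: np.ndarray   # HxWx2 forward flow to the next frame
```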

The digital twin world can simulate road conditions in different extreme climates

Even then, to ensure the quality of the dataset, the annotated results need to be validated. The virtual annotated dataset is evaluated by cross-validating it against the real physical world, using deep-learning object detection and semantic segmentation. With pixel-level accuracy and complete labels, the simulated dataset can complement a real dataset’s diversity in weather, lighting, and extreme operating conditions. Models trained on a mixture of a large amount of simulated data and a small amount of real data can exceed the performance of models trained on a large amount of real data alone.
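A minimal PyTorch sketch of that “large simulated plus small real” training mix; the tensors, the 10:1 ratio, and the classification stand-in are all illustrative assumptions:

```python
import torch
from torch.utils.data import ConcatDataset, DataLoader, TensorDataset

# Stand-ins for a large simulated dataset and a small real one (10:1 here,
# an assumed ratio); a real pipeline would load annotated frames instead.
sim_data  = TensorDataset(torch.randn(10_000, 3, 64, 64),
                          torch.randint(0, 10, (10_000,)))
real_data = TensorDataset(torch.randn(1_000, 3, 64, 64),
                          torch.randint(0, 10, (1_000,)))

# Shuffling the concatenated set interleaves simulated and real samples.
loader = DataLoader(ConcatDataset([sim_data, real_data]),
                    batch_size=32, shuffle=True)

for images, labels in loader:
    pass  # the perception model's forward/backward pass would go here
```

The cross-validation step described above would then score the resulting model on held-out real-world data, for example with detection or segmentation metrics.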

This multi-sensor virtual dataset collection and annotation, based on the AI driving simulation tool platform, is far less expensive than real-world data collection and annotation, and 40 times more efficient than real-world annotation.

The virtual simulation tool developed by Andy’s team has been applied to the R&D of common autonomous driving system technologies, providing safe, controllable, all-factor, multi-level testing and evaluation support for intelligent decision control, complex-environment perception, human-machine interaction and co-driving, vehicle-road cooperation, and network communication.

At the same time, the simulation engine that builds the “metaverse” virtual digital mapping of the real world has been extended to applications such as smart-city traffic systems, traffic environment simulation, smart road facility deployment and monitoring, and wireless network construction evaluation.

We believe that in the near future, autonomous driving, freshly returned from its “private lessons” in the EpiK Protocol metaverse, will excel in its real-world role.
