Can We Buy Self-driving Cars Right Now?

How Far Away is the Far Away?

Source: Apollo Moon

New Generation of Mass-Production Shared Unmanned Vehicle: Apollo Moon

On June 17, 2021, Baidu Apollo and Arcfox jointly released the Apollo Moon, a new generation of mass-produced shared unmanned vehicle. It has an operating life of more than five years and can provide long-term, stable autonomous-driving travel services. Baidu plans to deploy 1,000 Apollo Moons over the next three years to support its autonomous-driving operation fleet.

Baidu disclosed for the first time that the Apollo Moon costs 480,000 RMB ($74,160). Mr. Wang, vice president of Baidu and general manager of its autonomous driving technology division, said that the 480,000 RMB covers both the vehicle and the driverless kit, making it the lowest-cost driverless car in the world and marking the first time a driverless car has entered the price range of ordinary mass-production passenger cars.

Compared with self-driving vehicles in the industry that cost millions of RMB, the Apollo Moon's lower price of 480,000 RMB ($74,160) seems to bring self-driving a step closer to commercialization.

Tier 1 suppliers, OEMs, and technology companies are all exploring how to commercialize autonomous driving, but it is hard to predict how long this process will take.

This time, the 480,000 RMB price can be considered satisfactory for operators. For ordinary consumers, however, it is still not affordable enough: it is difficult to push such vehicles into the mass market, and large-scale adoption remains a long way off.

The Commercialization of Autonomous Driving Has Not Been Smooth

Even setting cost aside, the commercialization of autonomous driving has not gone smoothly in recent years: application scenarios remain limited, and a string of accidents has steadily eroded public trust in the technology.

Recently, a Waymo driverless car collided with an electric scooter; fortunately, no one was injured. In addition, according to Reuters, the U.S. auto safety regulator said it had opened 30 investigations since 2016 into Tesla crashes suspected of involving advanced driver-assistance systems, involving 10 deaths.

Some analysts believe that the commercialization of autonomous driving is affected by many factors, such as policy, the perception system (system redundancy), technology including chips (software) and data, high-precision maps, and infrastructure (V2X).

In terms of policy, judging from the promotion plans and current development stages of major countries, autonomous-driving rules and principles are being continuously refined toward realizing L4-level high automation around 2025. The United States, China, and Japan are making positive, steady progress (all have opened road testing). The EU is more cautious: currently, only Germany, Sweden, the Netherlands, Austria, and Belgium allow open-road testing.

More info: The World’s First Driverless Law to Allow L4 Class Automatic Driving on the Road in Germany

Conclusion

Since responsibility for safety has shifted from a single driver to multiple parties, including the OEM (with its highly integrated hardware), software providers, component suppliers, and the complete vehicle, and since the resulting industrial chain is long and still in the technology-incubation stage, the standardization and rationalization of policy supervision will play a decisive role in entry thresholds, technology promotion, and the landing of commercialization.

Key Technologies of Autonomous Driving

At present, the key technologies of autonomous driving include environmental perception, precise positioning, decision-making and planning, control and execution, high-precision mapping, vehicle networking (V2X), and vehicle testing and verification.

With the support of this technology system and the key hardware and software, a self-driving vehicle perceives the surrounding environment through sensors such as on-board cameras, LiDAR, millimeter-wave radar, and ultrasonic sensors, makes decisions based on the information obtained, and forms a safe and reasonable path plan. The vehicle's execution system then controls the vehicle to drive along the planned path.

This core technology stack of autonomous driving can be summarized simply as "perception, decision-making, and execution."
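
As a rough illustration of this perception-decision-execution loop, here is a minimal Python sketch. The class and field names (PerceptionModule, Planner, Controller, Obstacle, and so on) are hypothetical and only show how the three stages hand data to one another; they are not part of any real autonomous-driving stack.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Obstacle:
    x: float          # distance ahead of the ego vehicle (m)
    y: float          # lateral offset (m)
    speed: float      # obstacle speed (m/s)

@dataclass
class Plan:
    target_speed: float   # desired speed (m/s)
    steering: float       # desired steering angle (rad)

class PerceptionModule:
    """Perception: turn raw sensor readings into a list of obstacles."""
    def sense(self, raw_sensor_data: dict) -> List[Obstacle]:
        # A real system would fuse camera, LiDAR, radar, and other sources here.
        return [Obstacle(**o) for o in raw_sensor_data.get("obstacles", [])]

class Planner:
    """Decision-making: choose a target speed and steering from the scene."""
    def plan(self, obstacles: List[Obstacle]) -> Plan:
        # Toy rule: slow down if any obstacle is within 20 m directly ahead.
        blocked = any(o.x < 20.0 and abs(o.y) < 1.5 for o in obstacles)
        return Plan(target_speed=5.0 if blocked else 15.0, steering=0.0)

class Controller:
    """Execution: translate the plan into actuator commands."""
    def execute(self, plan: Plan) -> dict:
        return {"target_speed": plan.target_speed, "steering": plan.steering}

# One tick of the perception -> decision -> execution loop.
perception, planner, controller = PerceptionModule(), Planner(), Controller()
frame = {"obstacles": [{"x": 12.0, "y": 0.3, "speed": 1.0}]}
commands = controller.execute(planner.plan(perception.sense(frame)))
print(commands)   # {'target_speed': 5.0, 'steering': 0.0}
```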

Among these, the sensors carry out the perception work.

The sensing system, also referred to as the "central control system", is responsible for perceiving the surrounding environment and for collecting and processing environmental information as well as information from inside the vehicle, mainly involving road-boundary detection, vehicle detection, pedestrian detection, and other technologies.

To realize autonomous driving, one problem must be solved first: driving safety. To ensure that an autonomous vehicle can make the correct decision in every scenario, it must collect and identify data about its surroundings in real time, including but not limited to the state of the vehicle itself, traffic flow, road conditions, and traffic signs.

In other words, environmental perception plays a role similar to a human driver's eyes and ears.

To meet the needs of environmental perception, autonomous vehicles are equipped with many on-board sensors, such as cameras, LiDAR, millimeter-wave radar, and ultrasonic sensors. Working together with V2X, these sensors can obtain multi-source information such as the traffic environment and vehicle status in real time, providing support for decision-making.
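
To make the idea of combining multi-source information concrete, below is a small, hypothetical Python sketch that merges the latest reading from each source (camera, LiDAR, radar, V2X) into a single environment snapshot keyed by timestamp. The source names and fields are assumptions for illustration, not a description of any particular vendor's interface.

```python
from dataclasses import dataclass, field
from typing import Any, Dict

@dataclass
class EnvironmentSnapshot:
    """Latest view of the surroundings, built from several sources."""
    timestamp: float = 0.0
    readings: Dict[str, Any] = field(default_factory=dict)

    def update(self, source: str, timestamp: float, data: Any) -> None:
        # Keep only the newest reading per source and track the latest time seen.
        current = self.readings.get(source)
        if current is None or timestamp >= current["timestamp"]:
            self.readings[source] = {"timestamp": timestamp, "data": data}
            self.timestamp = max(self.timestamp, timestamp)

snapshot = EnvironmentSnapshot()
snapshot.update("camera", 10.00, {"lanes": 2, "pedestrians": 1})
snapshot.update("lidar", 10.02, {"obstacles": 3})
snapshot.update("radar", 10.01, {"closing_speed_mps": 4.2})
snapshot.update("v2x", 9.98, {"traffic_light": "red", "distance_m": 55})

print(snapshot.timestamp)           # 10.02
print(sorted(snapshot.readings))    # ['camera', 'lidar', 'radar', 'v2x']
```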

More info: Application of 3D Point Cloud Annotation in the Field of Automatic Driving

Application of LiDAR

LiDAR generates high-definition 3D "point cloud" images of surrounding objects from data obtained by emitting multiple laser beams at high frequency. It has been widely regarded as a necessary sensor for realizing unmanned driving.

What Is a 3D Point Cloud?

A 3D point cloud is a collection of numerous data points spread throughout a 3D space, where the points are collected through sensors such as LiDAR. The sensor emits light and measures the time it takes for the light to be reflected back in order to compute each point. The collected points are combined to form a complete image, as shown below.
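
As a rough illustration of how each point is derived from that reflection time, the sketch below converts a measured round-trip time and the beam's azimuth/elevation angles into an (x, y, z) point. This is only a simplified geometric model (constant speed of light, no noise or calibration), not the processing pipeline of any specific LiDAR.

```python
import math

SPEED_OF_LIGHT = 299_792_458.0  # m/s

def point_from_return(round_trip_s: float, azimuth_deg: float, elevation_deg: float):
    """Convert one laser return into an (x, y, z) point in the sensor frame.

    The light travels to the object and back, so the range is half the
    round-trip time multiplied by the speed of light.
    """
    r = SPEED_OF_LIGHT * round_trip_s / 2.0
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    # Spherical -> Cartesian: x forward, y left, z up.
    x = r * math.cos(el) * math.cos(az)
    y = r * math.cos(el) * math.sin(az)
    z = r * math.sin(el)
    return (x, y, z)

# A return that took ~200 ns corresponds to an object about 30 m away.
print(point_from_return(200e-9, azimuth_deg=15.0, elevation_deg=2.0))
```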

More info about LiDAR and 3D Point Cloud

Source: Velodyne LiDAR

3D point clouds are widely used for product development and analysis in fields such as architecture, aerospace, driving, traffic, medical equipment, regular consumer items, and more. The potential use cases and applications are only expected to increase in the future.

ByteBridge 3D Point Cloud Annotation Solution

ByteBridge's self-developed 3D point cloud labeling tool, quality-inspection tool, and pre-labeling functions can deliver high-quality, high-precision 3D point cloud annotation for 2D-3D fusion or 3D-only data from different manufacturers and devices, and provide a one-stop management service covering labeling, QA, and QC.

More info: ByteBridge Launches World’s First Mobile 3D Point Cloud Data Labeling Service

ByteBridge 3D Point Cloud Annotation Tool

3D Point Cloud Annotation Types:

  • Sensor Fusion Cuboids: 49 categories, including car, truck, heavy vehicle, two-wheeled vehicle, pedestrian, etc.
  • Sensor Fusion Segmentation: obstacle classification and differentiation of different lane types
  • Sensor Fusion Cuboids Tracking (see the sketch after this list):

① The same object is tracked with the same ID across frames, and its leaving state is labeled;

② Time-aligned 2D images can be provided, with point cloud outputs only.
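
To make the cuboid-plus-tracking output easier to picture, here is a small, hypothetical Python sketch of what one annotated frame might look like. The field names (track_id, category, center, dimensions, yaw, is_leaving) are illustrative assumptions, not ByteBridge's actual export schema.

```python
from dataclasses import dataclass, asdict
from typing import Tuple
import json

@dataclass
class CuboidAnnotation:
    """One 3D cuboid label in a single LiDAR frame."""
    track_id: int                           # same object keeps the same ID across frames
    category: str                           # e.g. "car", "truck", "pedestrian"
    center: Tuple[float, float, float]      # cuboid center (x, y, z) in meters
    dimensions: Tuple[float, float, float]  # length, width, height in meters
    yaw: float                              # heading around the vertical axis (rad)
    is_leaving: bool = False                # object is exiting the sensor's field of view

frame_42 = [
    CuboidAnnotation(track_id=7, category="car",
                     center=(12.4, -1.8, 0.9), dimensions=(4.5, 1.9, 1.6), yaw=0.02),
    CuboidAnnotation(track_id=11, category="pedestrian",
                     center=(6.1, 3.2, 0.9), dimensions=(0.6, 0.6, 1.7),
                     yaw=1.55, is_leaving=True),
]

print(json.dumps([asdict(c) for c in frame_42], indent=2))
```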

Advantages of Our 3D Point Cloud Annotation Service:

  • Support for 2D/3D sensor fusion and multiple cameras
  • Support for scalable data annotation
  • AI-powered sensor fusion tool: labeling at 2x-5x speed
  • Easy-to-use QC tool: real-time revision and synchronous feedback

ByteBridge 3D Point Cloud QC Tool

Cost-effective

A collaboration between the human workforce and AI algorithms ensures a price roughly 50% lower than the conventional market.

End

ByteBridge is a data labeling platform with robust tools for real-time workflow management, providing high-quality data with efficiency.

If you need data labeling and collection services, please have a look at bytebridge.io

If you would like to have a look at the 3D point cloud live demo, please feel free to contact us: support@bytebridge.io

Source: http://www.myzaker.com/article/60cee0268e9f0913f4009bcc
