Published in Nerd For Tech
Case Study of 2D Object Detection and Annotation in the Self-Driving Industry

In the previous article, we discussed a data annotation project for unmanned logistics vehicles:

2D Object Detection Labeling Case Study in the Self-Driving Industry

Now let's continue to see how 2D object detection labeling is done, with more detailed project instructions.

5. Labeling Target

(1) Immovable categories only need to be labeled when they are on the road; all other obstacles in the image need to be labeled, whether they are on the road or not.

(2) Only label targets whose pixel size is no less than 15 × 15.

(3) Only label objects whose visible proportion is no less than 1/3.

Note 1: Cars under a cover should be labeled, and cars without wheels should also be labeled according to their vehicle type, with the wheel points left unlabeled.

Note 2: When a vehicle has its doors open or is towing goods, label only the vehicle body itself, excluding the goods, the open doors, and the rearview mirrors.
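The size and visibility thresholds above can be sketched as a simple filter. The box format, argument names, and helper logic here are assumptions for illustration, not part of any actual labeling tooling:

```python
# Sketch of the target-selection rules: thresholds come from the spec,
# everything else (box format, flags) is an illustrative assumption.

MIN_SIDE_PX = 15           # rule (2): pixel size no less than 15 x 15
MIN_VISIBLE_RATIO = 1 / 3  # rule (3): visible proportion no less than 1/3

def should_label(box, visible_ratio, movable, on_road):
    """box = (x_min, y_min, x_max, y_max) in pixels."""
    width = box[2] - box[0]
    height = box[3] - box[1]
    if width < MIN_SIDE_PX or height < MIN_SIDE_PX:
        return False          # rule (2): too small to label
    if visible_ratio < MIN_VISIBLE_RATIO:
        return False          # rule (3): too heavily occluded
    if not movable and not on_road:
        return False          # rule (1): immovable categories only count on the road
    return True
```

For example, a 20 × 20 box for a half-visible movable obstacle passes the filter, while the same box for an immovable object off the road does not.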

6. Continuous Frame Labeling and Single Frame Labeling

For 2D visual obstacle annotation, tasks may be divided into continuous-frame dataset annotation and single-frame dataset annotation.

For datasets that require continuous-frame labeling, it is necessary to ensure that each obstacle keeps a single, unique ID across the entire dataset.
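A quality check for this ID requirement might look like the sketch below. The frame and annotation structure is an assumption for illustration, not an actual export format:

```python
# Minimal sketch of an ID-consistency check for continuous-frame labeling:
# within each frame an ID appears at most once, and across frames the same
# ID must always refer to the same obstacle category.

def check_unique_ids(frames):
    """frames: list of frames, each a list of {'id': ..., 'category': ...} dicts."""
    id_to_category = {}
    for frame_idx, annotations in enumerate(frames):
        seen_in_frame = set()
        for ann in annotations:
            obj_id = ann["id"]
            if obj_id in seen_in_frame:
                raise ValueError(f"duplicate ID {obj_id} in frame {frame_idx}")
            seen_in_frame.add(obj_id)
            # the same ID must keep the same category in every frame
            category = id_to_category.setdefault(obj_id, ann["category"])
            if category != ann["category"]:
                raise ValueError(f"ID {obj_id} changed category in frame {frame_idx}")
    return True
```

Running this over an exported dataset before delivery catches both duplicated IDs within a frame and IDs that silently switch identity between frames.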

7. Annotation Rules At Night

(1) The labeling rules at night are based on the rules used during the daytime.

(2) Objects listed in a given obstacle type are labeled as required, and those listed in the omission category are labeled as omission.

Objects that cannot be determined to be obstacles do not need to be labeled, and obstacles not listed in any given type are labeled as unknown.

(3) The car model (body) is the final criterion for whether the obstacle needs to be labeled, with the lights serving only as an auxiliary reference.

(4) Do not label objects that are too vague to be determined as obstacles.

(5) The occluded and truncated parts of an obstacle should be labeled.

(6) Ignore the halo from glaring lights and estimate the complete size of the obstacle.

Note 1: For images blurred because the camera is wet from rain, snow, or other causes, label the obstacle as required, regardless of image quality, as long as its category can be determined.

Label the object as an omission if its category cannot be determined but it is clearly on the road, and do not label it if it cannot be identified at all.

Note 2: For distant objects that are difficult to judge as obstacles, if they are within the road range, label them as required when their categories can be identified, and as unknown when their categories cannot be identified. If they are outside the road range, label them as omission.
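The decision rule in Note 2 can be written out as a tiny function. The label names follow the text above; the function itself is only an illustrative sketch:

```python
# Decision rule for distant, hard-to-judge objects at night (Note 2):
# outside the road range -> omission; inside the road range -> label as
# required when the category is identifiable, otherwise unknown.

def label_distant_object(in_road_range, category_identified):
    if not in_road_range:
        return "omission"
    return "as required" if category_identified else "unknown"
```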

You Configure and ByteBridge Annotates MANUALLY

Only Three Steps to Go

  • Log in with your email
  • Upload the sample
  • Tell us what to label: specify the minimum labeling size and the precision you need.

You can send the requirement to us, and we will handle the configuration job.

Then it’s our turn.

A demo and quote will be ready in less than 24 hours on weekdays.


ByteBridge 2D Object Detection Annotation

JSON Output

ByteBridge 2D Object Detection Annotation JSON
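For context, a 2D-box annotation export in JSON often looks roughly like the sketch below. Every field name here is an illustrative assumption, not ByteBridge's actual output schema:

```python
import json

# Hypothetical 2D object detection annotation record; all field names
# are assumptions for illustration, not ByteBridge's real schema.
annotation = {
    "image": "frame_000123.jpg",
    "objects": [
        {
            "id": 1,
            "category": "car",
            "bbox": [412, 260, 583, 371],  # [x_min, y_min, x_max, y_max] in pixels
            "occluded": True,
            "truncated": False,
        }
    ],
}

print(json.dumps(annotation, indent=2))
```

A record like this pairs each image with its list of boxes, so downstream training code can iterate over objects per frame.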


Outsource your data labeling tasks to ByteBridge, and you can get high-quality ML training datasets faster and at a lower cost!

  • Free Trial Without a Credit Card: you can get your sample results with a fast turnaround, check the output, and give feedback directly to our project manager.
  • 100% Human Validated
  • Transparent & Standard Pricing: clear pricing is available (labor cost included)

Why not have a try?





