Autoware on Yonohub — Part 2

Ahmed Radwan · Published in YonoHub · 5 min read · Aug 24, 2020
Lidar Localization and Perception by Yukihiro Saito

We are pleased to announce the release of the Autoware Localization and Perception blocks on Yonohub. With these blocks, you can use Autoware.AI directly on Yonohub without downloading it or setting up any special environment. Seamlessly integrate your own algorithms and blocks, and connect them to any dataset or ROSBag. Share and publish your blocks directly on Yonohub to showcase your solution while protecting your IP. Sign up now and start using the Autoware.AI blocks.

This article is part of a series. Check out the full series: Part 1

In the previous article, we gave a quick overview of what Autoware is, why we use it on Yonohub, and how to create your own Autoware.AI environment or use our ready-to-use environment to build a custom app.

Learning Objectives

In this article, we’ll discuss:

  • Autoware Localization using NDT Matching.
  • Autoware Detection using PointPillars.
  • Autoware Clustering using Euclidean Clustering.
  • Lidar Localization and Perception on Yonohub.

Autoware Localization — NDT Matching

For an autonomous car to know where it is in the world, it needs an accurate source of information about the vehicle's position and orientation. GPS might seem sufficient, but the error in GPS position estimates is around 2–10 m, which is not acceptable for an autonomous car; the error must be reduced to the range of centimeters.

Localization algorithms use a pre-recorded map, the current lidar scans, and a coarse position and orientation estimate (e.g., from GPS) to compute a more accurate pose. With matching algorithms, the position error can be brought below 10 centimeters, which is a huge improvement.

Autoware offers two localization methods, ICP and NDT matching. NDT is known to converge from a larger range of initial pose estimates than ICP, and it runs faster.

NDT Matching

As mentioned earlier, to get a precise estimate of the vehicle's position and orientation, we need a matching algorithm that aligns the current lidar scans with the pre-recorded map. So, how does the algorithm work?

The NDT matching algorithm takes the pre-recorded map and divides the point cloud into three-dimensional boxes (voxels), fitting a normal (probability) distribution to the points in each voxel. Then, for every point of the input scan, a score function that measures how well the scan fits these distributions is optimized with Newton's method, yielding a more accurate pose (position and orientation) estimate.

NDT Matching in Autoware
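
To make the idea concrete, here is a minimal sketch of map-to-scan alignment using PCL's NormalDistributionsTransform, the library implementation that Autoware's ndt_matching node builds on. The file names and parameter values below are illustrative assumptions, not Autoware's exact settings:

```cpp
// Minimal map-to-scan alignment sketch with PCL's NDT implementation.
// File names and parameter values are assumed for illustration.
#include <iostream>
#include <Eigen/Core>
#include <pcl/io/pcd_io.h>
#include <pcl/point_types.h>
#include <pcl/registration/ndt.h>

int main()
{
  pcl::PointCloud<pcl::PointXYZ>::Ptr map(new pcl::PointCloud<pcl::PointXYZ>);
  pcl::PointCloud<pcl::PointXYZ>::Ptr scan(new pcl::PointCloud<pcl::PointXYZ>);
  pcl::io::loadPCDFile("map.pcd", *map);    // pre-recorded point cloud map (assumed file)
  pcl::io::loadPCDFile("scan.pcd", *scan);  // current, downsampled lidar scan (assumed file)

  pcl::NormalDistributionsTransform<pcl::PointXYZ, pcl::PointXYZ> ndt;
  ndt.setResolution(1.0);              // voxel size used to build the distributions (m)
  ndt.setStepSize(0.1);                // maximum Newton line-search step
  ndt.setTransformationEpsilon(0.01);  // convergence threshold
  ndt.setMaximumIterations(30);
  ndt.setInputTarget(map);             // distributions are fitted over the map voxels
  ndt.setInputSource(scan);

  // Initial guess, e.g. from GNSS or the previous pose; identity here for brevity.
  Eigen::Matrix4f init_guess = Eigen::Matrix4f::Identity();

  pcl::PointCloud<pcl::PointXYZ> aligned;
  ndt.align(aligned, init_guess);      // optimizes the NDT score with Newton's method

  std::cout << "converged: " << ndt.hasConverged()
            << "  score: " << ndt.getFitnessScore() << std::endl;
  std::cout << "estimated pose:\n" << ndt.getFinalTransformation() << std::endl;
  return 0;
}
```

In the actual pipeline, the initial guess comes from GNSS or the previous frame's pose, and the scan is downsampled before matching, which is why the Voxel Grid Filter block appears in the pipeline we build below.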

Autoware Detection — PointPillars

Object detection is essential in autonomous driving: the vehicle needs to know which objects surround it and where they are in the environment. One way to perform this task is to detect objects directly from lidar points, which can be done effectively with the PointPillars method.

PointPillars is a state-of-the-art method for object detection from lidar point clouds. It uses a neural network that can be trained to detect cars, pedestrians, and other objects. It works by applying a novel encoder that uses PointNets to learn a representation of the point cloud organized into vertical columns, which is why it is called PointPillars. The encoded features can then be used with any 2D convolutional architecture. The method achieves outstanding runtime performance: around 62 Hz for the original PointPillars network and 105 Hz for a faster version. The link to the paper is in the references section.

PointPillars Network overview
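
The heart of the encoder is the pillar discretization: points are grouped into vertical columns on an x-y grid before any learning happens. The sketch below shows only that grouping step; the grid size and the per-pillar point cap are illustrative assumptions, and the learned PointNet encoder and 2D convolutional backbone that follow are not shown:

```cpp
// Sketch of the pillar-grouping step only: lidar points are binned into
// vertical columns on an x-y grid. Values below are assumptions for
// illustration, not Autoware's or the paper's exact settings.
#include <cmath>
#include <cstdint>
#include <iostream>
#include <unordered_map>
#include <vector>

struct Point { float x, y, z, intensity; };

int main()
{
  std::vector<Point> cloud = { /* lidar points would go here */ };

  const float pillar_size = 0.16f;     // x-y cell size in metres (assumed)
  const std::size_t max_points = 100;  // cap on points kept per pillar (assumed)

  // Key each pillar by its integer grid coordinates packed into one 64-bit value.
  std::unordered_map<std::uint64_t, std::vector<Point>> pillars;
  for (const Point& p : cloud) {
    const std::int32_t ix = static_cast<std::int32_t>(std::floor(p.x / pillar_size));
    const std::int32_t iy = static_cast<std::int32_t>(std::floor(p.y / pillar_size));
    const std::uint64_t key =
        (static_cast<std::uint64_t>(static_cast<std::uint32_t>(ix)) << 32) |
        static_cast<std::uint32_t>(iy);
    auto& pts = pillars[key];
    if (pts.size() < max_points) pts.push_back(p);  // extra points are dropped
  }

  // Each non-empty pillar now holds a small, unordered set of points. The real
  // network augments each point with offsets from the pillar centre, encodes the
  // pillar with a PointNet, and scatters the features back into a 2D pseudo-image
  // that is fed to the convolutional backbone.
  std::cout << "non-empty pillars: " << pillars.size() << std::endl;
  return 0;
}
```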

Autoware Clustering — Euclidean Clustering

Clustering divides an unorganized point cloud into smaller groups, with each cluster of points representing a single object in the environment. This significantly reduces the processing required for point clouds and can be used to identify objects in the environment before they are passed on for classification.

Euclidean Clustering in Autoware

One way to perform clustering is Euclidean clustering: build a KD-tree from the point cloud, find the points that are near each other based on Euclidean distance (the straight-line distance between two points), and give each group an id that represents its cluster. If you want to learn more about this method, you can read this article.
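
Below is a minimal sketch of this idea using PCL's EuclideanClusterExtraction, the library routine that Autoware's Euclidean clustering node is built on. The input file, cluster tolerance, and size limits are assumptions for illustration:

```cpp
// Minimal Euclidean clustering sketch with PCL. The input file name and the
// tolerance/size parameters are assumed values for illustration.
#include <iostream>
#include <vector>
#include <pcl/io/pcd_io.h>
#include <pcl/point_types.h>
#include <pcl/search/kdtree.h>
#include <pcl/segmentation/extract_clusters.h>

int main()
{
  pcl::PointCloud<pcl::PointXYZ>::Ptr cloud(new pcl::PointCloud<pcl::PointXYZ>);
  pcl::io::loadPCDFile("scan_no_ground.pcd", *cloud);  // scan with ground points removed (assumed file)

  // Build a KD-tree so nearest-neighbour queries are fast.
  pcl::search::KdTree<pcl::PointXYZ>::Ptr tree(new pcl::search::KdTree<pcl::PointXYZ>);
  tree->setInputCloud(cloud);

  pcl::EuclideanClusterExtraction<pcl::PointXYZ> ec;
  ec.setClusterTolerance(0.5);   // points closer than 0.5 m join the same cluster
  ec.setMinClusterSize(20);      // discard tiny clusters (likely noise)
  ec.setMaxClusterSize(25000);
  ec.setSearchMethod(tree);
  ec.setInputCloud(cloud);

  std::vector<pcl::PointIndices> clusters;
  ec.extract(clusters);          // each entry lists the point indices of one object

  std::cout << "found " << clusters.size() << " clusters" << std::endl;
  return 0;
}
```

Each resulting cluster can then be wrapped in a bounding box and handed to a tracker, which is roughly the role of the Euclidean Cluster Detection and IMM UKF PDA Tracker blocks in the pipeline below.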

Lidar Pipeline on Yonohub

Let us start building the lidar localization and perception pipeline on Yonohub. To do that, we'll need to:

Step 1: Set up the Project Files
For this pipeline to work properly, we'll need some files for each module. For the lidar data, we'll use a demo ROSBag that provides sample lidar and GNSS data; it can be downloaded from here.
For the Lidar PointPillars module, we'll also need to download sample pretrained model files from here.

Step 2: Getting Autoware.AI Blocks from YonoStore
The Autoware.AI blocks are available for free on YonoStore. Once you have purchased them, adding them to a pipeline is a drag-and-drop process.

Blocks to purchase:
(ROSBag — Demo Map Loader — Demo Vector Map Loader — nmea to pose — Voxel Grid Filter — NDT Matching — Lidar Point Pillars — Euclidean Cluster Detection — IMM UKF PDA Tracker — Visualize Objects — Rviz)

Step 3: Building the Pipeline
The following video shows how to build and run the pipeline and visualize outputs:

Alternatively, instead of creating the pipeline from scratch, you can clone this GitHub repository into the “/MyDrive/” folder to get the pipeline and its required project files. Make sure to run DemoROSBagDownload.sh to download the demo ROSBag.

Yonohub

Yonohub is the first cloud-based system for designing, sharing, and evaluating autonomous vehicle algorithms using just blocks. Yonohub features a drag-and-drop tool to build complex systems consisting of many blocks, a marketplace to share and monetize blocks, a builder for custom environments, and much more.

Get $25 free credits when you sign up now. For researchers and labs, contact us to learn more about Yonohub sponsorship options. Yonohub: A Cloud Collaboration Platform for Autonomous Vehicles, Robotics, and AI Development. www.yonohub.com

If you liked this article, please consider following us on Twitter at @yonohub, emailing us directly, or finding us on LinkedIn. I’d love to hear from you if I can help you or your team get started with YonoHub.

References
