Autoware on Yonohub (Vision pipeline) — Part 3

Ahmed Radwan · Published in YonoHub · 4 min read · Sep 16, 2020

This article is part of the Autoware series. Check out the full series: Part 1, Part 2

We are pleased to announce the release of the Autoware Vision blocks on Yonohub. With these blocks, you can use Autoware.AI directly on Yonohub without downloading it or setting up any particular environment. Seamlessly integrate your own algorithms and blocks and connect them to any dataset or ROSBag. Share and publish your blocks directly on Yonohub to showcase your solution while protecting your IP. Sign up now and start using the Autoware.AI blocks.

The previous article gave a brief introduction to Autoware and showcased the Autoware.AI Localization and Perception blocks, which can be used directly as plug and play. In this article, we’ll show how to build a pipeline using vision algorithms from Autoware.AI. We’ve also packaged the blocks so they can be used with simple drag and drop: you won’t waste time installing libraries and building packages. Just drag and drop the blocks you need, adjust their parameters, and click the launch button. The vision blocks are also compatible with the datasets available on Yonohub: nuScenes, KITTI, BDD, Comma.ai, and ApolloScape.

Learning Objectives

In this article, we’ll discuss:

  • Autoware Segmentation using ENet.
  • Autoware 2D Object Detection using YOLO.
  • Autoware Object Tracking using Beyond Pixels Tracker.

ENet Segmentation

Semantic segmentation is one of the tools that help perceive the environment surrounding autonomous vehicles, and it plays an essential role in finding target objects. ENet stands for “efficient neural network,” and it’s designed for low-latency operation. According to its authors, it runs up to 18 times faster, requires 75 times fewer FLOPs, and has far fewer parameters than comparable segmentation networks, while delivering similar or better accuracy.

[Figure: ENet segmentation output]
[Table: ENet benchmark results]
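
Once the ENet block is running, its output can be consumed like any other ROS image stream. The following is a minimal sketch of a subscriber for the segmented image; the topic name /image_segmented is an assumption, so check the block’s parameters on Yonohub for the actual output topic.

```python
#!/usr/bin/env python
# Minimal sketch: subscribe to the segmentation image published by the ENet block.
# The topic name "/image_segmented" is an assumption -- verify it against the
# block's configuration on Yonohub.
import rospy
from sensor_msgs.msg import Image
from cv_bridge import CvBridge

bridge = CvBridge()

def on_segmented(msg):
    # Convert the ROS image to an OpenCV array for further processing.
    frame = bridge.imgmsg_to_cv2(msg, desired_encoding="bgr8")
    rospy.loginfo("segmented frame: %dx%d", frame.shape[1], frame.shape[0])

if __name__ == "__main__":
    rospy.init_node("enet_output_listener")
    rospy.Subscriber("/image_segmented", Image, on_segmented, queue_size=1)
    rospy.spin()
```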

YOLO Object Detection

YOLO stands for “You Only Look Once,” and it’s a powerful, highly efficient, real-time object detection algorithm. It has been revised several times, with each new version improving on the speed and accuracy of the one before it.

YOLO consists of a single CNN that predicts multiple bounding boxes and class probabilities for those boxes. It’s trained on full images and directly optimizes detection performance.
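
To get a feel for what the YOLO block publishes, here is a minimal sketch that prints each detection’s class, score, and 2D box. The topic name /detection/image_detector/objects and the use of autoware_msgs/DetectedObjectArray are assumptions based on typical Autoware.AI vision pipelines, so confirm them against the block’s configuration.

```python
#!/usr/bin/env python
# Minimal sketch: print the classes and 2D boxes coming out of the YOLO block.
# Topic name and message type are assumptions -- verify against the block.
import rospy
from autoware_msgs.msg import DetectedObjectArray

def on_objects(msg):
    for obj in msg.objects:
        rospy.loginfo("%s (score %.2f) at x=%d y=%d w=%d h=%d",
                      obj.label, obj.score, obj.x, obj.y, obj.width, obj.height)

if __name__ == "__main__":
    rospy.init_node("yolo_output_listener")
    rospy.Subscriber("/detection/image_detector/objects",
                     DetectedObjectArray, on_objects, queue_size=1)
    rospy.spin()
```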

Beyond Pixels Based Tracker

Tracking detected objects is an essential component of scene understanding, so it’s important to associate the objects detected in camera images across consecutive frames.

Beyond Pixels Tracker is a novel approach to multi-object tracking in road scenes. It takes the detections produced by YOLO (or any other object detector) and tracks each detected object from frame to frame.

[Figure: Beyond Pixels tracking results on some challenging sequences]
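
A simple way to see the tracker at work is to count how many distinct track IDs it has assigned so far. The sketch below assumes the tracker republishes an autoware_msgs/DetectedObjectArray with a stable id per object on /detection/object_tracker/objects; both the topic name and the field usage should be checked against the block itself.

```python
#!/usr/bin/env python
# Minimal sketch: count distinct track IDs assigned by the Beyond Pixels block.
# Topic name and the assumption of a stable per-object `id` should be verified.
import rospy
from autoware_msgs.msg import DetectedObjectArray

seen_ids = set()

def on_tracked(msg):
    for obj in msg.objects:
        seen_ids.add(obj.id)
    rospy.loginfo("objects in frame: %d, unique tracks so far: %d",
                  len(msg.objects), len(seen_ids))

if __name__ == "__main__":
    rospy.init_node("beyond_track_listener")
    rospy.Subscriber("/detection/object_tracker/objects",
                     DetectedObjectArray, on_tracked, queue_size=1)
    rospy.spin()
```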

Vision Pipeline on Yonohub

Let’s build an Autoware-based vision pipeline that applies segmentation, YOLO object detection, and tracking. To do that, we’ll need to:

Step 1: Set Up the Project Files
For this pipeline to work properly, we’ll need a few files for each module. For the image data, we’ll use a KITTI ROSBag that provides camera images along with LiDAR, GPS, and IMU data; it can be obtained by following this tutorial.
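
Before wiring the bag into the pipeline, it’s worth checking that it actually contains camera image topics. A minimal sketch using the rosbag Python API is shown below; the bag filename is a placeholder for whichever KITTI sequence you converted.

```python
#!/usr/bin/env python
# Minimal sketch: list the topics in the converted KITTI bag to confirm it
# contains camera images. The filename is a placeholder.
import rosbag

with rosbag.Bag("kitti_sequence.bag") as bag:
    info = bag.get_type_and_topic_info()
    for topic, meta in info.topics.items():
        print("%-45s %-30s %d msgs" % (topic, meta.msg_type, meta.message_count))
```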

For the segmentation module to work, we’ll need to download sample pre-trained files from here.

For the YOLO object detection block to work, we’ll need to download the weights and config files from here.

Step 2: Getting Autoware.AI Blocks from YonoStore
The Autoware.AI blocks are available for free on YonoStore. Once you’ve purchased them, using them is a simple drag-and-drop process.

Blocks to purchase:
(KITTI raw rosbag — ENet Segmentation — YOLO — Beyond Track — Camera Objects Visualizer — Rviz)

Step 3: Building the Pipeline
The following video shows how to build and run the pipeline and visualize outputs:

Yonohub

Yonohub is the first cloud-based system for designing, sharing, and evaluating autonomous vehicle algorithms using just blocks. Yonohub features a drag-and-drop tool to build complex systems consisting of many blocks, a marketplace to share and monetize blocks, a builder for custom environments, and much more.

Get $25 free credits when you sign up now. For researchers and labs, contact us to learn more about Yonohub sponsorship options. Yonohub: A Cloud Collaboration Platform for Autonomous Vehicles, Robotics, and AI Development. www.yonohub.com

If you liked this article, please consider following us on Twitter at @yonohub, emailing us directly, or finding us on LinkedIn. I’d love to hear from you if I can help you or your team get started with YonoHub.

References

https://www.autoware.ai/
https://gitlab.com/autowarefoundation/autoware.ai
https://github.com/tomas789/kitti2bag
https://arxiv.org/abs/1606.02147
https://arxiv.org/abs/1802.09298
https://pjreddie.com/darknet/yolo/
