ROS 2 Mobile Robotics Series — Part 2

Sharad Maheshwari
Aug 29, 2022


Note — This series is a technical documentation of me building and learning mobile robotics in ROS 2.
Check out Part 0 to understand how this series is set up.

In Part 1, we set up TurtleBot3 simulation in Gazebo, which forms the basis of all our mobile robotics experiments in ROS 2.

What’s next? — We now learn how to set up autonomous navigation in ROS 2 using Nav2. In ROS 1, we used the Navigation stack, which was a combination of multiple packages like move_base, amcl and map_server, to name a few.

But Nav2 is the new cool guy in town, with most of these packages already a part of it, plus enhanced functionality. The makers wanted to get “ROS Navigation out of the lab”, or simply, make it production quality.

How do we set up autonomous navigation in ROS 2?

To answer this question, let’s first look at some robot navigation concepts, shall we?

For a tiny second, put yourself in the shoes of the robot you’re asking to move autonomously. Let’s say someone says — “Ey, move to location B!”

What would you do? Well, I would ask the following two questions —

  1. Where am I?
  2. Where is location B, or in other words, what do my surroundings (the map) look like?

And that’s what a robot actually needs —

  1. Localization — Where is the robot?
  2. Mapping — What does the map/world look like?

Technically, that’s what the robot’s brain needs in order to process this info and decide to move. In this case, our robot’s brain is the Navigation package. Intuitive, eh?

So we’ve defined the two pieces of information needed to navigate.

Since this is still a basic post, let’s avoid theory (which is important, but for later) and move to the Nav2 package in ROS 2.

As we established, Nav2 needs the robot’s current location and the map.
In ROS/ROS 2, this translates to two requirements —

  • Robot Location — map -> odom transform and odom -> base_link transform

The expected inputs to Nav2 are the two TF transformations conforming to REP-105.
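If you want to see these frames on a live system, you can dump the TF tree. A quick check (the executable name changed across distros: on Galactic it should be view_frames; older distros used view_frames.py):

source /opt/ros/galactic/setup.bash
# Listens to /tf for a few seconds and writes a frames PDF showing the tree
# (look for map -> odom -> base_link)
ros2 run tf2_tools view_frames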

It is the job of the global positioning system (GPS, SLAM, motion capture) to, at minimum, provide the map -> odom transformation. Usually, amcl, which is a part of Nav2, is used for this.

It is the role of the odometry system to provide the odom -> base_link transformation. Odometry can come from many sources, including LIDAR, RADAR, wheel encoders, VIO, and IMUs. The goal of the odometry system is to provide a smooth and continuous local frame based on robot motion. The global positioning system will update the transformation relative to the global frame to account for the odometric drift.

The robot_localization package is typically used for this fusion. It will take in N sensors of various types and provide continuous and smooth odometry to TF and to a topic. A typical mobile robotics setup may have odometry from wheel encoders, IMUs, and vision fused in this manner.

The smooth output can then be used for dead reckoning for precise motion, and for updating the position of the robot accurately between global position updates.
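We won’t need sensor fusion in this post (simulation odometry is clean enough, as we’ll see), but if you want to experiment with it later, the package is an apt install away. Assuming the standard Galactic package name:

sudo apt install ros-galactic-robot-localization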

  • Map — In applications needing SLAM (Simultaneous Localization and Mapping), mapping is done in parallel to navigation. But currently, we are interested in navigating with a static map (we already know the map of the surroundings). In ROS 2, SLAM Toolbox can be used to generate a static map, and we can save it for navigation.

If all this makes sense, awesome. If not, be patient. Making this work in ROS 2 below will clear things up :)

From the deepest point of my heart, I believe in the learn-while-you-build strategy. Sooooo, shall we?

Setting up ROS 2 based Navigation with TurtleBot3 simulation—

Step 1 — Set up ROS 2 and TurtleBot3 in Gazebo for simulation
Please check out Part 0 and Part 1 to complete this, if you haven’t already.

If you did, you should be good with running the TurtleBot3 simulation in Gazebo.

Step 2 — Set up localization

First up, run TurtleBot3 in Gazebo.

Source the terminal for ROS 2 —

source /opt/ros/galactic/setup.bash

Go to the TurtleBot3 workspace we built in Part 1, then source the terminal —

source ./install/setup.bash

Run TurtleBot3 world simulation —

ros2 launch turtlebot3_gazebo turtlebot3_world.launch.py

You should see the simulation running —

Now, in another terminal, source ROS 2 and check the list of topics —

ros2 topic list
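With the TurtleBot3 world running, the list should contain entries like these (abridged; the exact set depends on your TurtleBot3 model):

/cmd_vel
/imu
/joint_states
/odom
/scan
/tf
/tf_static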

Look, look! There’s the /odom topic, which gives us odometry values; the corresponding odom -> base_link transform is published on /tf. So we have localization ready, right?

Well, while we will use this, in the general case, not quite.
When you use a real robot, odometry values are noisy and prone to drift. So we need to fuse them with other sensor outputs like an IMU (local) and GPS (global) using the robot_localization package.

In our case, /odom barely has any noise (I haven’t found a way to increase the model’s odometry noise without republishing it in code). So we’ll use it directly.

Thus, for us, localization values are ready.
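If you want to convince yourself, you can watch both the raw odometry and the transform it feeds, in a sourced terminal (just a sanity check, not required for the rest):

# Raw odometry from the Gazebo differential-drive plugin
ros2 topic echo /odom
# The odom -> base_link transform being broadcast on /tf
ros2 run tf2_ros tf2_echo odom base_link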

Step 3 — Set up a static map

To do this, we use SLAM Toolbox in ROS 2.

  • Install SLAM Toolbox
sudo apt install ros-<ros2-distro>-slam-toolbox

I’m using Galactic, so —

sudo apt install ros-galactic-slam-toolbox
  • Launch the slam_toolbox package for mapping —

In a new terminal (after sourcing ROS 2 of course, always do that) —

source /opt/ros/galactic/setup.bash
ros2 launch slam_toolbox online_async_launch.py
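One gotcha worth flagging: Gazebo publishes simulated time on /clock, and slam_toolbox needs to follow it for scans and TF to line up. If the map doesn’t build for you, try passing the sim-time parameter explicitly (use_sim_time is the standard launch argument name, though I haven’t needed it in every setup):

ros2 launch slam_toolbox online_async_launch.py use_sim_time:=true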
  • Launch RViz2 with the necessary visualisations

In a new terminal, launch rviz2

source /opt/ros/galactic/setup.bash
rviz2

Add the following displays for visualisation —

  1. LaserScan — topic /scan, Reliability Policy: Best Effort
  2. Map — topic /map
  3. TF

Here’s my RViz view —

  • Run teleop to move the robot around for mapping —

In a new terminal —

source /opt/ros/galactic/setup.bash
ros2 run teleop_twist_keyboard teleop_twist_keyboard
  • Put Gazebo and RViz side by side, and teleop to see the map being created in RViz —

I usually slow down the bot significantly for a better map

  • Save the map

In a new terminal, run the map saver from nav2_map_server —

source /opt/ros/galactic/setup.bash
ros2 run nav2_map_server map_saver_cli -f ~/map

This saves the map in our home directory (map.pgm and map.yaml).
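If you’re curious, map.yaml is just metadata pointing at the image file. Roughly, you should see something like this (the values shown are illustrative; yours will differ):

cat ~/map.yaml
# image: map.pgm
# mode: trinary
# resolution: 0.05
# origin: [-1.9, -1.6, 0]
# negate: 0
# occupied_thresh: 0.65
# free_thresh: 0.25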

That was mapping.

And. And. And. We’re ready to navigate next!

Step 4 — Autonomous Navigation

Please kill the slam_toolbox node, the teleop node, and RViz. We don’t need them anymore. At this point, only the TurtleBot3 Gazebo simulation should be running.

Now that we have localization (the /odom topic) and mapping (our saved map) ready, we can start navigation.

  • Launch Navigation

In a new terminal —

source /opt/ros/galactic/setup.bash
ros2 launch nav2_bringup bringup_launch.py use_sim_time:=True autostart:=True map:=/path/to/your-map.yaml

(For us, the map path is $HOME/map.yaml, since that is where we saved it in Step 3.)

This will set up all the nodes needed for autonomous navigation, but with a warning about the map frame not existing. We’ll solve this soon.
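Before moving on, you can sanity-check the bringup by listing the running nodes (node names as of Galactic, so treat this list as indicative):

ros2 node list
# Expect entries like /amcl, /map_server, /planner_server,
# /controller_server, /bt_navigator and the lifecycle managers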

In another terminal, launch RViz, but with a pre-set visualisation —

source /opt/ros/galactic/setup.bash
ros2 run rviz2 rviz2 -d $(ros2 pkg prefix nav2_bringup)/share/nav2_bringup/rviz/nav2_default_view.rviz

RViz view —

You’ll see a Global Status Error, which is about the unavailability of the map frame.
The thing is, the map -> odom transform is published by amcl, which was launched as a part of nav2_bringup above. But amcl needs an initial pose estimate to start working.

To provide that, click on “2D Pose Estimate” (one of the green arrows) on the toolbar above and provide the current pose based on Gazebo (position and heading) by clicking and dragging on the map in RViz. You should be as precise as possible (but minor errors are fine).
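If you’d rather not click, amcl listens on the /initialpose topic, so you can publish the estimate from a terminal instead. A sketch (the x/y below are illustrative; they must match where your bot actually stands in Gazebo):

ros2 topic pub --once /initialpose geometry_msgs/msg/PoseWithCovarianceStamped \
"{header: {frame_id: map}, pose: {pose: {position: {x: -2.0, y: -0.5}, orientation: {w: 1.0}}}}"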

Once you do that, this should be the new view —

Now, we have all the transforms needed for autonomous navigation: map -> odom -> base_link.
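You can verify the full chain the same way as before:

# Should now print a live transform instead of a missing-frame error
ros2 run tf2_ros tf2_echo map base_link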

  • Navigation

Started from the bottom, and here we are ;)

Let’s navigate!

To command the robot to move autonomously, go to RViz and click on Nav2 Goal. Now, click (for position) and drag (for orientation) at the destination in RViz. Next? Watch the magic happen in Gazebo!

You will see that Nav2 planned a global path and started moving the bot along it with the local controller. If this made you happy, you can keep giving more Nav2 Goals!
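Under the hood, the Nav2 Goal tool publishes a PoseStamped on /goal_pose, so you can also send goals from a terminal. Again a sketch, with illustrative coordinates:

ros2 topic pub --once /goal_pose geometry_msgs/msg/PoseStamped \
"{header: {frame_id: map}, pose: {position: {x: 0.5, y: 0.5}, orientation: {w: 1.0}}}"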

So, we’ve successfully set up autonomous navigation using Nav2 for TurtleBot3, eh? Pretty cool, I’d say.

That was it for this post.

Parting Notes —

What’s next, you ask?

I’d ideally want to continue this series with a physical TurtleBot. Maybe I crowdsource a TurtleBot. Maybe something else.

I’d love to know what you want. We can figure it out together!

Cheers!
Sharad
