How to use the ROS robot_localization package

Zillur
Jan 26, 2023 · 6 min read

In this article, I will walk you through the ROS “robot_localization” package, which fuses data from multiple sensors to properly localize a mobile robot on a map. This article is aimed at beginners who want to test a sensor fusion setup.

1. What is the use of this package?

First, let’s look at why we need this ROS package. Mobile robots such as agriculture robots, warehouse delivery robots, or more advanced self-driving cars move from one place to another. In other words, the robot moves from one point in the XYZ Cartesian coordinate system to another. To do that, it needs to know how far it has traveled in each direction, how far it still has to go, and where exactly it is in the environment, just like we humans use a map to locate ourselves in the world. Without any idea of where we are, we feel lost. Just as we use Google or Apple Maps, a robot uses its sensors to obtain localization data. We can get that data from multiple sensors such as wheel encoders, an IMU, GPS, and LiDAR. In this article, we will focus on wheel encoders and an IMU, as they are cheaper than LiDAR and readily available.

A wheel encoder is a device attached to a wheel or motor of the robot. It measures how fast the wheel rotates, typically reported as rotations per minute (RPM). Let’s say the radius of a wheel is r meters and it rotates at N RPM; then the velocity in m/s is
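    v = 2πrN / 60

(The wheel covers its circumference, 2πr meters, N times every 60 seconds.)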

From the velocity, we can obtain the distance traveled by multiplying by the elapsed time. For a differential drive robot (two-wheeled robot), please read this article; it describes how you can compute the odometry message from the wheel encoders in ROS. Even if we do not get the full odometry message but only the twist part, which is just the velocity portion of the odometry message, that will be enough for us, as sketched below.
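To make this concrete, here is a minimal sketch (not code from the package or the linked article) of how left and right wheel RPM readings from a differential drive robot could be turned into a geometry_msgs/TwistWithCovarianceStamped message. The wheel radius, track width, topic name, and covariance values are placeholder assumptions you would replace with your robot’s real numbers.

    #!/usr/bin/env python
    # Sketch: convert left/right wheel RPM of a differential drive robot into a
    # TwistWithCovarianceStamped message that robot_localization can fuse.
    # Wheel radius, track width, topic name, and covariances are placeholders.
    import math
    import rospy
    from geometry_msgs.msg import TwistWithCovarianceStamped

    WHEEL_RADIUS = 0.17  # wheel radius in meters (placeholder)
    TRACK_WIDTH = 0.55   # distance between left and right wheels in meters (placeholder)

    def rpm_to_mps(rpm):
        # one revolution covers 2*pi*r meters, and rpm/60 is revolutions per second
        return 2.0 * math.pi * WHEEL_RADIUS * rpm / 60.0

    def publish_twist(pub, left_rpm, right_rpm):
        v_left = rpm_to_mps(left_rpm)
        v_right = rpm_to_mps(right_rpm)
        msg = TwistWithCovarianceStamped()
        msg.header.stamp = rospy.Time.now()
        msg.header.frame_id = "base_link"
        msg.twist.twist.linear.x = (v_left + v_right) / 2.0           # forward velocity
        msg.twist.twist.angular.z = (v_right - v_left) / TRACK_WIDTH  # yaw rate
        cov = [0.0] * 36
        cov[0] = 0.01   # variance of linear X velocity (tune for your encoders)
        cov[35] = 0.02  # variance of angular Z velocity
        msg.twist.covariance = cov
        pub.publish(msg)

    if __name__ == "__main__":
        rospy.init_node("encoder_twist_publisher")
        pub = rospy.Publisher("wheel/twist", TwistWithCovarianceStamped, queue_size=10)
        rate = rospy.Rate(20)  # publish at 20 Hz
        while not rospy.is_shutdown():
            publish_twist(pub, left_rpm=60.0, right_rpm=60.0)  # replace with real encoder readings
            rate.sleep()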

Another important data source is the IMU. It provides the orientation of the robot (roll, pitch, yaw) with respect to the ENU (East, North, Up) frame. In the ENU convention, the X-axis of the IMU points east, the Y-axis points north, and the Z-axis points up. This means that when your robot faces east, the IMU should report (0, 0, 0) as roll, pitch, and yaw, respectively. Along with the orientation, the IMU also provides angular velocity about the three axes as well as linear acceleration. Install the ROS driver package corresponding to your specific IMU to get the ROS sensor_msgs/Imu message.
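As a quick sanity check of your IMU driver, the sketch below subscribes to a sensor_msgs/Imu topic and converts the quaternion orientation into roll, pitch, and yaw. The topic name “/imu/data” is an assumption, so confirm what your driver actually publishes first.

    #!/usr/bin/env python
    # Sketch: read sensor_msgs/Imu and print roll, pitch, yaw in radians.
    # The topic name /imu/data is an assumption; check yours with "rostopic list".
    import rospy
    from sensor_msgs.msg import Imu
    from tf.transformations import euler_from_quaternion

    def imu_callback(msg):
        q = msg.orientation
        roll, pitch, yaw = euler_from_quaternion([q.x, q.y, q.z, q.w])
        rospy.loginfo("roll=%.3f pitch=%.3f yaw=%.3f", roll, pitch, yaw)
        # msg.angular_velocity and msg.linear_acceleration are also available here

    if __name__ == "__main__":
        rospy.init_node("imu_rpy_printer")
        rospy.Subscriber("/imu/data", Imu, imu_callback)
        rospy.spin()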

Now the question becomes: why do we need multiple sensors to localize the robot when we could do it with only one? The problem is that none of these sensors is perfect. For example, wheel encoder odometry slowly drifts away from the true values; on a slippery surface, the robot may barely move even though the wheels spin fast, and in that case the encoders provide faulty data. Similarly, an IMU has its own measurement noise, and its magnetometer does not work well in the presence of external magnetic fields. However, when we have more than one source, we can fuse the data and get a better result, and accuracy is vital here. That is why robots use redundant data.

To fuse data from multiple sources, the most popular algorithm is the Kalman Filter. This filter is so important in science and engineering that it is used in almost all sophisticated control systems, including robots on Moon and Mars missions. The filter was originally designed for linear systems only, but most real-world systems are non-linear, so another version called the Extended Kalman Filter (EKF) was developed. This article explains coherently how it works; you need basic knowledge of linear algebra and statistics, especially the Gaussian distribution, to understand the theory. Implementing an EKF from scratch can be laborious, and this is where our friend, the “robot_localization” package, comes into play. It is an implementation of the EKF in ROS: all you need to do is install it and edit some parameters, and you are good to go without working through the mathematics and programming yourself. Once we have the odometry from the wheel encoders and the IMU message from the IMU, we can feed them to the “robot_localization” package.
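On ROS Noetic, the package can usually be installed with “sudo apt install ros-noetic-robot-localization” (replace “noetic” with your ROS distribution name), or it can be built from source in your catkin workspace.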

2. What do we need?

First, we need an Ubuntu 20.04 machine with ROS Noetic installed (Ubuntu 18.04 with ROS Melodic works too). To test the sensor fusion package, we will use the Gazebo simulator, though you can use an actual robot if you have one. To install the necessary files, check the implementation section. We need any two of the following four message types:

  1. nav_msgs/Odometry
  2. geometry_msgs/TwistWithCovarianceStamped
  3. sensor_msgs/Imu
  4. geometry_msgs/PoseWithCovarianceStamped

Also, the ROS coordinate frame transformation (/tf) messages are essential; read the next section on why we need them. Our simulator will give us all the necessary messages. To drive the robot around, we will use a Logitech F710 gamepad.
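Once the simulator (or your robot’s driver stack) is running, you can confirm that these messages are actually being published with “rostopic list”, and check a topic’s message type with “rostopic info <topic_name>”.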

3. ROS TF

If you want to work on ROS and robotics, you must have a solid understanding of ROS frame transformations. They are essential for all robotics modules like perception, localization, control, and motion planning. In short, /tf gives us the coordinate frames of all static and moving parts of the robot. For instance, every robot has a “base_link” frame, which is usually the center point of the robot. Then each wheel has its own frame, such as “left_wheel_link” or “right_wheel_link”. Likewise, for the LiDAR, camera, GPS, and IMU, we need corresponding frames. Each frame has a relative pose (X, Y, Z, roll, pitch, yaw) with respect to another frame, and the ROS /tf tree gives us all of these relationships. For the “robot_localization” package to work properly, we need to assign the correct frame names. Please read the ROS wiki page to learn more about publishing frame transforms on the /tf topic.
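For example, a fixed sensor mounting can be broadcast with the tf2 static transform publisher: running “rosrun tf2_ros static_transform_publisher 0.1 0 0.2 0 0 0 base_link imu_link” (a made-up offset and frame name) places an “imu_link” frame 0.1 m forward of and 0.2 m above “base_link” with no rotation.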

Here, we need the “base_link” or “base_footprint” frame and the “odom” frame. The “odom” frame is a world-fixed frame, meaning it does not move over time. The moment we start the “robot_localization” package, the “odom” frame is placed at X=0, Y=0, Z=0 with roll=0, pitch=0, and yaw=0. As we move the robot from that starting position, the “base_link” frame, which is fixed to the body of the robot, starts to move as well. Let’s say that after 10 seconds our “base_link” frame is at X=10, Y=5, Z=0; it means our robot has moved 10 m along the X-axis and 5 m along the Y-axis from its initial position. The “robot_localization” package gives us the “odom” → “base_link” transformation based on the fused (and therefore better) odometry data; we just need to feed it the corresponding message topics and /tf.
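If you want to read this transformation yourself, for example to log the robot’s estimated position, a minimal tf2 listener sketch (assuming the default “odom” and “base_link” frame names) looks like this:

    #!/usr/bin/env python
    # Sketch: look up the odom -> base_link transform published by
    # robot_localization and print the robot's position in the odom frame.
    import rospy
    import tf2_ros

    if __name__ == "__main__":
        rospy.init_node("odom_base_link_listener")
        tf_buffer = tf2_ros.Buffer()
        tf_listener = tf2_ros.TransformListener(tf_buffer)
        rate = rospy.Rate(1.0)
        while not rospy.is_shutdown():
            try:
                # target frame "odom", source frame "base_link", latest available time
                t = tf_buffer.lookup_transform("odom", "base_link",
                                               rospy.Time(0), rospy.Duration(1.0))
                p = t.transform.translation
                rospy.loginfo("base_link in odom: x=%.2f y=%.2f z=%.2f", p.x, p.y, p.z)
            except (tf2_ros.LookupException, tf2_ros.ConnectivityException,
                    tf2_ros.ExtrapolationException):
                rospy.logwarn("odom -> base_link transform not available yet")
            rate.sleep()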

4. Implementation

The details of the implementation and required ROS files can be found at https://github.com/zillur-av/localization-stack.git

We will use the Husky robot simulation in Gazebo to test our sensor fusion package.
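If you do not already have the Husky simulation installed, it is typically available as Debian packages (for example “ros-noetic-husky-gazebo” on Noetic) as well as from source; the exact package and launch file names depend on your ROS distribution, and the linked repository contains the ROS files used in this article.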

In the configuration file, we have to define our coordinate frames. For our setup, we do not need to define “map_frame”, but we must define “odom_frame”, “base_link_frame”, and “world_frame”, where “world_frame” is set to the same value as “odom_frame”. We also set “publish_tf” to true, which means the package will publish the “odom” → “base_link” transformation.

We also need to define which data we want to fuse. Here we are fusing one odometry message (“odom0”: topic name) and one IMU message (“imu0”: topic name). In “odom0_config” and “imu0_config” there are 15 elements, each of which should be either true or false. In order, these elements are: (X, Y, Z, roll, pitch, yaw, X velocity, Y velocity, Z velocity, roll velocity, pitch velocity, yaw velocity, X acceleration, Y acceleration, Z acceleration). Setting an entry to true means we are fusing that quantity from that sensor. Be sure to modify them according to your sensors, because the accuracy of the fusion highly depends on this configuration; a rough sketch of such a configuration file is shown below.
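The exact file in the linked repository may differ, but a typical ekf_localization_node parameter file for this setup looks roughly like the sketch below. The parameter names are the ones discussed above; the topic names are assumptions based on the Husky simulation’s defaults, so verify them with “rostopic list” before using this.

    # Rough sketch of an EKF configuration (YAML); adjust topics and flags to your robot.
    frequency: 30
    two_d_mode: true            # planar robot: ignore Z, roll, and pitch
    publish_tf: true            # publish the odom -> base_link transformation

    odom_frame: odom
    base_link_frame: base_link
    world_frame: odom           # same as odom_frame, as described above

    odom0: /husky_velocity_controller/odom
    odom0_config: [false, false, false,     # X, Y, Z
                   false, false, false,     # roll, pitch, yaw
                   true,  true,  false,     # X, Y, Z velocity
                   false, false, true,      # roll, pitch, yaw velocity
                   false, false, false]     # X, Y, Z acceleration

    imu0: /imu/data
    imu0_config: [false, false, false,
                  true,  true,  true,       # roll, pitch, yaw
                  false, false, false,
                  true,  true,  true,       # roll, pitch, yaw velocity
                  true,  true,  false]      # X, Y acceleration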

5. Result

Once you run the simulation and our package, you will see both the raw and the filtered odometry in the RViz window. To print the filtered odometry message, run “rostopic echo /odometry/filtered” in a terminal and you will see the improved odometry estimates.
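To compare the raw and filtered estimates numerically, you can also plot individual fields of both topics with rqt_plot, for example “rosrun rqt_plot rqt_plot /odometry/filtered/pose/pose/position/x”.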

That’s it for today. If you want to learn more about sensor fusion and improve performance further, go through the mathematics of the Kalman Filter and the role of the covariance matrices.
