Integrating sonar and IR sensor plugins into a robot model in Gazebo with ROS

What is ROS?

ThiruVenthan
Arimac
Dec 7, 2018

The Robot Operating System (ROS) is an open-source robotics middleware released under the permissive BSD license. ROS provides services such as communication between processes, low-level device control, hardware abstraction, package management, and visualization tools for debugging. A ROS-based application can be represented as a graph, where processing happens in nodes and nodes communicate with one another to accomplish the overall task.
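
As a minimal illustration of this graph model, the sketch below shows a rospy node that publishes strings to a topic; the node and topic names here are placeholders and are not part of the workspace used later in this post.

#!/usr/bin/env python
# Minimal ROS node: publishes a string to the 'chatter' topic once per second.
# Any other node on the ROS graph can receive these messages with
# rospy.Subscriber('chatter', String, callback).
import rospy
from std_msgs.msg import String

if __name__ == '__main__':
    rospy.init_node('talker')                                # register this process as a node in the ROS graph
    pub = rospy.Publisher('chatter', String, queue_size=10)  # advertise the topic
    rate = rospy.Rate(1)                                     # publish at 1 Hz
    while not rospy.is_shutdown():
        pub.publish(String(data='hello from talker'))
        rate.sleep()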

What is Gazebo?

Gazebo is a robotics simulator that allows us to simulate and test our algorithms in indoor and outdoor environments. Some of its great features are advanced 3D visualization, support for various physics engines (ODE, Bullet, Simbody, and DART), and the ability to simulate sensors with noise, all of which result in more realistic simulations.

Gazebo user interface

Requirements

  1. A computer with Ubuntu 16.04.5 LTS
  2. ROS (Kinetic) installed and a basic understanding of ROS (tutorials)
  3. The gazebo_ros package installed
  4. A catkin workspace with the robot URDF and world files (a sample workspace: git clone https://thiruashok@bitbucket.org/thiruashok/rover_ws.git)

Launching Gazebo with the robot model

Go to the cloned directory, open a terminal (Ctrl+Alt+T), and run the following commands.

cd rover_ws
catkin_make
source devel/setup.bash
roslaunch rover_gazebo rover_world.world

You can see the robot in a simulation world in Gazebo, as shown in the figure below.

rover bot in gazebo simulation

Open another terminal and run the following command to see the available topics.

rostopic list

You will get the following output.

/clock
/cmd_vel
/gazebo/link_states
/gazebo/model_states
/gazebo/parameter_descriptions
/gazebo/parameter_updates
/gazebo/set_link_state
/gazebo/set_model_state
/gazebo_gui/parameter_descriptions
/gazebo_gui/parameter_updates
/joint_states
/odom
/rosout
/rosout_agg
/tf
/tf_static

From the output you can observe that there are no topics related to sensors yet, but the cmd_vel topic is available, so we can navigate the robot by sending commands (given below) to this topic. As the robot uses a differential drive mechanism, you can move it around by changing the linear x and angular z values (a Python version of the command is sketched right after it).

rostopic pub /cmd_vel geometry_msgs/Twist "linear:
x: 1.0
y: 0.0
z: 0.0
angular:
x: 0.0
y: 0.0
z: 0.0"
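
The same command can be sent from a script instead of the command line. Below is a minimal rospy sketch that keeps publishing a geometry_msgs/Twist message to /cmd_vel; the node name and speed values are placeholders you can adjust.

#!/usr/bin/env python
# Sketch: drive the differential-drive rover by publishing Twist messages
# to the /cmd_vel topic listed above.
import rospy
from geometry_msgs.msg import Twist

def drive():
    rospy.init_node('rover_teleop')
    pub = rospy.Publisher('/cmd_vel', Twist, queue_size=10)
    rate = rospy.Rate(10)  # publish at 10 Hz
    cmd = Twist()
    cmd.linear.x = 1.0   # forward speed
    cmd.angular.z = 0.0  # yaw rate; a non-zero value makes the rover turn
    while not rospy.is_shutdown():
        pub.publish(cmd)
        rate.sleep()

if __name__ == '__main__':
    try:
        drive()
    except rospy.ROSInterruptException:
        pass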

Adding sonar and IR sensor models to the robot model

Open the rover.xacro file in the rover_ws/src/rover_description directory using your favorite text editor, and add the following code above the </robot> tag.

<joint name="ir_front_joint" type="fixed">
<axis xyz="0 1 0" />
<origin rpy="0 0 0" xyz="0.5 0 0" />
<parent link="base_footprint"/>
<child link="base_ir_front"/>
</joint>
<link name="base_ir_front">
<collision>
<origin xyz="0 0 0" rpy="0 0 0"/>
<geometry>
<box size="0.01 0.01 0.01"/>
</geometry>
</collision>
<visual>
<origin xyz="0 0 0" rpy="0 0 0"/>
<geometry>
<box size="0.01 0.01 0.01"/>
</geometry>
</visual>
<inertial>
<mass value="1e-5" />
<origin xyz="0 0 0" rpy="0 0 0"/>
<inertia ixx="1e-6" ixy="0" ixz="0" iyy="1e-6" iyz="0" izz="1e-6" />
</inertial>
</link>
<joint name="sonar_front_joint" type="fixed">
<axis xyz="0 1 0" />
<origin rpy="0 0 0" xyz="0.5 0 0.25" />
<parent link="base_footprint"/>
<child link="base_sonar_front"/>
</joint>
<link name="base_sonar_front">
<collision>
<origin xyz="0 0 0" rpy="0 0 0"/>
<geometry>
<box size="0.01 0.01 0.01"/>
</geometry>
</collision>
<visual>
<origin xyz="0 0 0" rpy="0 0 0"/>
<geometry>
<box size="0.01 0.01 0.01"/>
</geometry>
</visual>
<inertial>
<mass value="1e-5" />
<origin xyz="0 0 0" rpy="0 0 0"/>
<inertia ixx="1e-6" ixy="0" ixz="0" iyy="1e-6" iyz="0" izz="1e-6" />
</inertial>
</link>

The above lines of code add the sensor models (simple small boxes) to the robot model. For a basic understanding of a robot's URDF file, refer to the URDF tutorials. Now launch the world again with the following command.

roslaunch rover_gazebo rover_world.world

By changing the rpy and xyz values of the origin tag within the joint, the sensor position and orientation can be changed.

<origin rpy="0 0 0" xyz="0.5 0 0.25" />
rover with sensor fitted on body

You can notice that the sensor model is now visible on top of the robot model.

Adding the sensor plugin for Sonar and IR

The gazebo_ros_range plugin can be used to model both the sonar and the IR sensor. This plugin publishes messages in the sensor_msgs/Range message format, so integration with ROS is straightforward. To add the plugin, open the rover_ws/src/rover_description/urdf/rover.gazebo file in your favorite text editor and add the following lines above the </robot> tag.

<gazebo reference="base_ir_front">        
<sensor type="ray" name="TeraRanger">
<pose>0 0 0 0 0 0</pose>
<visualize>true</visualize>
<update_rate>50</update_rate>
<ray>
<scan>
<horizontal>
<samples>10</samples>
<resolution>1</resolution>
<min_angle>-0.14835</min_angle>
<max_angle>0.14835</max_angle>
</horizontal>
<vertical>
<samples>10</samples>
<resolution>1</resolution>
<min_angle>-0.14835</min_angle>
<max_angle>0.14835</max_angle>
</vertical>
</scan>
<range>
<min>0.01</min>
<max>2</max>
<resolution>0.02</resolution>
</range>
</ray>
<plugin filename="libgazebo_ros_range.so" name="gazebo_ros_range">
<gaussianNoise>0.005</gaussianNoise>
<alwaysOn>true</alwaysOn>
<updateRate>50</updateRate>
<topicName>sensor/ir_front</topicName>
<frameName>base_ir_front</frameName>
<radiation>INFRARED</radiation>
<fov>0.2967</fov>
</plugin>
</sensor>
</gazebo>

The block above is for the IR sensor. For the sonar you can simply copy and paste the same block again, set the gazebo reference to base_sonar_front, and change the topicName and frameName to appropriate values; a version with those changes applied is sketched below.
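
Here is one way the sonar block could look. The sensor and plugin names are arbitrary choices made only to keep them distinct from the IR block, and the radiation tag is switched to ultrasound as an optional extra so the published Range messages report the ultrasound radiation type; the ray, range, and noise parameters are simply copied from the IR block and can be tuned to match your actual sonar.

<gazebo reference="base_sonar_front">
  <sensor type="ray" name="SonarSensor">
    <pose>0 0 0 0 0 0</pose>
    <visualize>true</visualize>
    <update_rate>50</update_rate>
    <ray>
      <scan>
        <horizontal>
          <samples>10</samples>
          <resolution>1</resolution>
          <min_angle>-0.14835</min_angle>
          <max_angle>0.14835</max_angle>
        </horizontal>
        <vertical>
          <samples>10</samples>
          <resolution>1</resolution>
          <min_angle>-0.14835</min_angle>
          <max_angle>0.14835</max_angle>
        </vertical>
      </scan>
      <range>
        <min>0.01</min>
        <max>2</max>
        <resolution>0.02</resolution>
      </range>
    </ray>
    <plugin filename="libgazebo_ros_range.so" name="gazebo_ros_sonar">
      <gaussianNoise>0.005</gaussianNoise>
      <alwaysOn>true</alwaysOn>
      <updateRate>50</updateRate>
      <topicName>sensor/sonar_front</topicName>
      <frameName>base_sonar_front</frameName>
      <radiation>ultrasound</radiation>
      <fov>0.2967</fov>
    </plugin>
  </sensor>
</gazebo>

With both sensor blocks added to rover.gazebo, launch Gazebo again: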

roslaunch rover_gazebo rover_world.world
rover with sensor plugin loaded

The sonar and IR sensor rays can now be seen in the simulation world. To see the sensor readings, subscribe to the appropriate topics using the commands below.

rostopic list

Now the output will be like this:

/clock
/cmd_vel
/gazebo/link_states
/gazebo/model_states
/gazebo/parameter_descriptions
/gazebo/parameter_updates
/gazebo/set_link_state
/gazebo/set_model_state
/gazebo_gui/parameter_descriptions
/gazebo_gui/parameter_updates
/joint_states
/odom
/rosout
/rosout_agg
/sensor/ir_front
/sensor/sonar_front
/tf
/tf_static

You will notice that the sonar and IR sensors are now publishing to two new topics, namely /sensor/ir_front and /sensor/sonar_front.

rostopic echo /sensor/ir_front

Run the above command in a terminal to observe the output from the IR sensor:

header:
  seq: 1
  stamp:
    secs: 1888
    nsecs: 840000000
  frame_id: "base_ir_front"
radiation_type: 1
field_of_view: 0.296700000763
min_range: 0.00999999977648
max_range: 2.0
range: 0.0671204701066
---
header:
  seq: 2
  stamp:
    secs: 1888
    nsecs: 860000000
  frame_id: "base_ir_front"
radiation_type: 1
field_of_view: 0.296700000763
min_range: 0.00999999977648
max_range: 2.0
range: 0.0722889602184
---
header:
  seq: 3
  stamp:
    secs: 1888
    nsecs: 880000000
  frame_id: "base_ir_front"
radiation_type: 1
field_of_view: 0.296700000763
min_range: 0.00999999977648
max_range: 2.0
range: 0.0635933056474
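
To use these readings in your own code rather than just echoing them, you can subscribe to the topic with a small rospy node such as the sketch below; the node name and the obstacle threshold are arbitrary placeholders.

#!/usr/bin/env python
# Sketch: subscribe to the IR range topic published by the gazebo_ros_range
# plugin and warn when an obstacle is closer than a threshold.
import rospy
from sensor_msgs.msg import Range

OBSTACLE_THRESHOLD = 0.5  # metres; arbitrary value for illustration

def ir_callback(msg):
    rospy.loginfo("IR range: %.3f m", msg.range)
    if msg.range < OBSTACLE_THRESHOLD:
        rospy.logwarn("Obstacle within %.2f m of the front IR sensor", msg.range)

if __name__ == '__main__':
    rospy.init_node('ir_front_listener')
    rospy.Subscriber('/sensor/ir_front', Range, ir_callback)
    rospy.spin()  # keep the node alive and processing callbacks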

In the same way, other sensors (lidar, camera, IMU) can be integrated into the robot model. This helps a lot in validating algorithms and finding the optimal sensor placement without fully building the actual hardware.
