Implementing DJI M100 Emulator in ROS-Gazebo for HITL

DJI Matrice 100

Nowadays aerial robots, or drones as they are popularly known, are used ubiquitously in our daily lives. Although their true potential is not yet fulfilled, the number of applications built on these low-cost bad boys will increase dramatically in the coming decade. The DJI Matrice 100 is one such product, and it offers an Onboard Software Development Kit (OSDK) for those who want to develop only software and prototype rapidly. In detail, a from-scratch drone project would involve steps unrelated to a CS person's expertise, such as hardware integration, electrification and system identification, before any useful application could be implemented. That is why these types of robots offer a great amount of utility to the robotics software community. Nonetheless, everything comes at a price. A ready-made SDK also restricts the developer in many aspects, mostly through the assumptions the product owner made while configuring it. I will not go into the inherent problems of the DJI Matrice 100 OSDK; anyone interested can find a bunch of them here.

So, let’s get started with the use case. HITL stands for Hardware-in-the-Loop, and DJI offers its own minimalistic DJI Assistant for this purpose. In essence, it simulates the sensors on the robot as if it were really flying in the field. Those who have used the DJI OSDK know that it is a must-use: without it, we cannot simulate the onboard sensors that make HITL testing possible. So, there are three prerequisites for this tutorial:

  • A computer (I have tried Ubuntu 16.04 and 18.04) that can install the following packages. I presume ROS and Gazebo are already installed.
    - DJI OSDK 3.7
    - DJI OSDK-ROS 3.7
  • A Windows or Mac that can install DJI Assistant.
DJI Assistant — In Software View

The work I describe in this post is implemented in two packages, dji_m100_description and dji_m100_gazebo. Before diving into implementation details, I would like to give much of the credit to caochao39 on GitHub, since I have used his model of the DJI Matrice 100. His bridge node was also the main reason I started to implement the same functionality as a Gazebo plugin. My main contributions in this work are:

  1. Creating a proper gimbal URDF and splitting the robot description into modular parts, which makes the camera easy to add and remove.
  2. Implementing the emulator as a Gazebo plugin. The previous work uses the front-end API of Gazebo through ROS services. ROS services are not designed for synchronous, high-frequency tasks such as this one, so even calling them at a very high rate does not prevent flickering in the Gazebo simulation.

Description of Robot

DJI M100 with Gimbal in Gazebo
DJI M100 Top View
Gimbal URDF

In order to properly emulate the gimbal of the DJI M100, we need to define three continuous joints, each of which can rotate only about the x, y or z axis, respectively. I don’t have any experience with multi-joint links, so I created three links (roll, pitch and yaw links) to connect the three joints of the same names. Also important are the fixed virtual joint and the virtual gimbal link defined at the beginning of the URDF. They are necessary to isolate the gimbal’s displacement from the body so that gimbal movements are purely rotational. One might also remove this virtual link/joint pair and add the displacement directly to the continuous roll joint. I haven’t tried that, since I like to sustain modularity as much as possible and I don’t know whether it would work. Finally, the camera joint attaches the yaw link to the camera link, which simulates a Gazebo ROS Camera sensor (here is a tutorial on Gazebo sensor plugins). The visual of the drone is added only to the last link to prevent multiple visualizations. That’s all for the robot description.
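As a rough sketch of that chain (the link names and offsets below are illustrative placeholders, not copied from dji_m100_description, and real links would also need inertial elements for Gazebo):

```xml
<!-- Sketch only: names and origins are placeholders. -->
<joint name="virtual_gimbal_joint" type="fixed">  <!-- isolates displacement from the body -->
  <parent link="base_link"/>
  <child link="virtual_gimbal_link"/>
  <origin xyz="0.1 0 -0.05" rpy="0 0 0"/>
</joint>
<link name="virtual_gimbal_link"/>

<joint name="gimbal_roll_joint" type="continuous">
  <parent link="virtual_gimbal_link"/>
  <child link="gimbal_roll_link"/>
  <axis xyz="1 0 0"/>  <!-- rotates about x only -->
</joint>
<link name="gimbal_roll_link"/>

<joint name="gimbal_pitch_joint" type="continuous">
  <parent link="gimbal_roll_link"/>
  <child link="gimbal_pitch_link"/>
  <axis xyz="0 1 0"/>  <!-- rotates about y only -->
</joint>
<link name="gimbal_pitch_link"/>

<joint name="gimbal_yaw_joint" type="continuous">
  <parent link="gimbal_pitch_link"/>
  <child link="gimbal_yaw_link"/>
  <axis xyz="0 0 1"/>  <!-- rotates about z only -->
</joint>
<link name="gimbal_yaw_link"/>

<joint name="camera_joint" type="fixed">
  <parent link="gimbal_yaw_link"/>
  <child link="camera_link"/>
</joint>
<link name="camera_link"/>  <!-- carries the Gazebo ROS Camera sensor -->
```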

Implementing Gazebo Plugin as DJI M100 Emulator

LH CS(on left) and RH CS(on right)

If you are familiar with the DJI ROS SDK, you know that a bunch of information is received over various topics. For example, we can receive the real-time angular and linear velocities, linear acceleration and orientation of the drone by subscribing to the /dji_sdk/imu topic. You can access the full list of topics here. In fact, only three topics were necessary to implement this plugin. The other topics could be used to add redundant features, but subscribing to the /dji_sdk/attitude, /dji_sdk/gps_position and /dji_sdk/gimbal_angle topics is adequate to emulate in Gazebo the real-time behavior simulated in DJI Assistant.

The data received on the gimbal angle topic can be used as is. However, the attitude of the base and the GPS position cannot be used directly. This is because DJI Assistant uses a left-handed coordinate system whereas Gazebo uses a right-handed one (note that both use the Z axis as the up vector). Therefore, the X and Y components must be negated for both orientation and position, as below:

// Orientation
this->base_orientation.w = quat_msg->quaternion.w;
this->base_orientation.x = -quat_msg->quaternion.x;
this->base_orientation.y = -quat_msg->quaternion.y;
this->base_orientation.z = quat_msg->quaternion.z;
...
// Position

Note that the local position is computed from the GPS data received directly from the /dji_sdk/gps_position topic. The complete implementation can be found in the links given at the beginning of the post. Simply put, the goal is to reach exactly the same GPS point in both Gazebo and DJI Assistant. Having different local positions in the two simulators, which is the case here, does not matter.
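To give an idea of what such a GPS-to-local conversion looks like, here is a hypothetical helper (not taken from the actual plugin, which may use a different projection) based on the common flat-Earth equirectangular approximation, accurate over the small areas typical of HITL tests:

```cpp
#include <cmath>
#include <utility>

// Hypothetical helper: converts a GPS fix (degrees) to a local (east, north)
// offset in meters relative to a reference ("home") fix, using a flat-Earth
// equirectangular approximation.
std::pair<double, double> localFromGps(double lat_deg, double lon_deg,
                                       double home_lat_deg, double home_lon_deg) {
  constexpr double kEarthRadius = 6378137.0;  // WGS-84 equatorial radius [m]
  constexpr double kDegToRad = M_PI / 180.0;
  const double d_lat = (lat_deg - home_lat_deg) * kDegToRad;
  const double d_lon = (lon_deg - home_lon_deg) * kDegToRad;
  // Longitude circles shrink with latitude, hence the cos() factor on east.
  const double east  = kEarthRadius * d_lon * std::cos(home_lat_deg * kDegToRad);
  const double north = kEarthRadius * d_lat;
  return {east, north};
}
```

In a plugin, the first received fix would be stored as the home point, and the offset computed for each subsequent fix would be applied to the model pose (with X and Y negated as discussed above).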

Finally, all the received data is applied in the OnUpdate function of the Gazebo plugin. OnUpdate is registered as a callback via the event::Events::ConnectWorldUpdateBegin function of the Gazebo Events API. With this registration, the callback is executed at each iteration of the simulation.

this->update_connection = event::Events::ConnectWorldUpdateBegin(
    boost::bind(&DJI_ROS_ControlPlugin::OnUpdate, this));

An important point is the velocity update of the overall model. The nonzero components of the linear and angular velocities should be zeroed out whenever the pose of the model changes. If you remove the lines where the velocities are reset at each iteration, you will see the robot slowly descending. I don’t know the exact reason, but the physics engine behind Gazebo probably accumulates velocity errors due to the rapid pose changes.



Gimbal Manipulation with RQT GUI
Base Translation with RQT Gui
Base Rotation in Yaw direction with RQT GUI

You can enable all three topics at once to see the movements in combination 😄. I used the RQT GUI to publish to the relevant topics instead of the real DJI SDK with an attached DJI M100, since I’m not allowed to fly one at the company. Nonetheless, it is straightforward to see that these topics will be filled with the proper information by the flight controller once the drone is connected to the onboard computer. As I mentioned, I built on Cao Chao’s existing work, and he has a recording of it. The end result is the same, plus the gimbal.

DJI M100 in Gazebo controlled with RC


In this post, I have illustrated how the basic functionality of DJI Assistant can be extended with a Gazebo plugin. After enabling HITL in Gazebo, one can run any type of experiment with the rich Gazebo API. For example, thousands of models exist in the Gazebo model repository, which allows very advanced scenarios to be crafted. Some example scenarios include:

  • Using Actors in order to simulate a trackable object.
  • Using gym-gazebo from Erle Robotics, which extends the OpenAI Gym API for real-time Reinforcement Learning.
  • Simulating multiple robots in one simulation using ROS_MASTER_URI, so you can realize multi-drone scenarios as well.
A sample world in Gazebo

Above is a snapshot of a simple scenario that I created in five minutes. If you have any questions or suggestions for improvement, feel free to comment. Lastly, if you liked the post and found it useful, give it a clap 👏



Tahsincan Kose

Robotics Software Engineer. Doing my CS MSc at Middle East Technical University.