Introducing the EZ-RASSOR
Today we are proud to announce the release of the EZ-RASSOR, an educational robotics software platform built by students at UCF (The University of Central Florida) in conjunction with the Florida Space Institute and NASA Swamp Works engineers at the Kennedy Space Center!
What Did NASA Want?
This project came to us with several major requirements:
- Develop manual and autonomous controls for a mining rover currently under construction at NASA.
- Create a simulated environment and a simulated rover to test these controls.
- Build an autonomous loop that enables the rover to act independently, without human intervention.
Sounds simple enough, right?
What Did We Deliver?
We built several components for this project that operate semi-independently. I’ll start by discussing our framework, the Robot Operating System.
The Robot Operating System
The Robot Operating System (ROS) is arguably the most important piece of this project. Despite its name, ROS isn't an operating system in the traditional sense. Instead, I like to think of it as a system that operates robots via standardized messaging and a linked graph of interdependent nodes.
ROS uses a publisher/subscriber system to transport messages through “topics” between separate processes (“nodes”). These nodes are programs that perform all of the actual functions of the system, like applying force to a joint, processing image data from the cameras, or even starting up and running the entire simulation.
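To make the publisher/subscriber idea concrete, here's a minimal sketch of the pattern in plain Python. This is an illustrative analogue, not actual ROS code (real nodes use rospy's `Publisher` and `Subscriber` classes), and the topic name is made up:

```python
from collections import defaultdict

class TopicBus:
    """A toy message bus illustrating ROS-style publish/subscribe."""

    def __init__(self):
        self._subscribers = defaultdict(list)  # topic name -> list of callbacks

    def subscribe(self, topic, callback):
        """Register a callback to run whenever `topic` receives a message."""
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        """Deliver `message` to every callback subscribed to `topic`."""
        for callback in self._subscribers[topic]:
            callback(message)

# Two "nodes" communicating over a hypothetical wheel-instruction topic.
bus = TopicBus()
received = []
bus.subscribe("/wheel_instructions", received.append)
bus.publish("/wheel_instructions", {"left": 1.0, "right": 1.0})
```

The key property this models is decoupling: the publisher never learns who (if anyone) is listening, which is what lets ROS nodes be swapped in and out independently.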
ROS is the foundation of all of the features we implemented for the EZ-RASSOR. Movement controls are implemented by mapping gamepad buttons to certain topic data outputs using the Joy node and a custom translator. Autonomous functionality is implemented by routing camera data to our autonomous controller via a series of topics. Even our mobile app controller indirectly relies on ROS to pass its HTTP POST requests into the rest of the system.
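As a sketch of that button-to-command translation, the idea looks roughly like this (the button indices and command names below are hypothetical; the real translator consumes Joy node messages and publishes onto our instruction topics):

```python
# Hypothetical mapping from gamepad button indices (as reported by a
# Joy-style node) to high-level rover commands.
BUTTON_COMMANDS = {
    0: "RAISE_FRONT_ARM",
    1: "LOWER_FRONT_ARM",
    2: "DIG_FRONT_DRUM",
    3: "DUMP_FRONT_DRUM",
}

def translate(pressed_buttons):
    """Convert a list of button states (1 = pressed) into rover commands."""
    return [BUTTON_COMMANDS[i]
            for i, state in enumerate(pressed_buttons)
            if state == 1 and i in BUTTON_COMMANDS]

commands = translate([1, 0, 0, 1])  # buttons 0 and 3 pressed
```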
To help visualize the flow of data through our software suite, here’s a graphic of our current ROS graph. The ovals represent ROS nodes and the arrows indicate the movement of information through ROS topics:
Every Project Needs an App
A big, complex system is useless if it can’t be controlled! Our mobile app controller is one of the main ways end-users interact with our software. Written in React Native for Android and iOS, our app uses HTTP POST requests to send commands to our ROS graph indirectly via a controller server that runs locally on the rover. Our mobile app doesn’t use ROS at all, which keeps it quite small and free of any spooky dependencies (we would know; installing ROS on a mobile device is not fun).
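As an illustration, a command from the app might be packaged roughly like this. The endpoint path and field names here are hypothetical, not our server's actual API:

```python
import json

def build_command_request(command, host="localhost", port=8080):
    """Package a rover command as the pieces of an HTTP POST request.

    Returns the URL, a JSON body, and headers; a real client would send
    these with fetch() in React Native.
    """
    url = "http://{}:{}/command".format(host, port)  # hypothetical endpoint
    body = json.dumps({"command": command})
    headers = {"Content-Type": "application/json"}
    return url, body, headers

url, body, headers = build_command_request("MOVE_FORWARD")
```

On the rover side, the controller server translates each request into a message published on the appropriate ROS topic, so the app never has to speak ROS itself.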
Creating a Simulation
A big, complex system is also useless if it has nothing to control! We built a simulated environment and model of the robot using Gazebo and Blender to test our control systems. For most of the development process, our team did not have the final parameters of the EZ-RASSOR available, so we had to fill in the blanks and use our imagination when creating the model. We did know that the EZ-RASSOR would be based on NASA’s existing RASSOR rover, and we also knew some of the basic parameters listed below:
The EZ-RASSOR will consist of four wheels, two drums, two drum arms, an inertial measurement unit sensor, and a front-facing stereo camera.
Using Blender, a popular open source 3D modeling tool, we created our model, shown below on the left.
Autonomous Control
Two of the coolest, and most difficult, features that our team implemented were obstacle detection and obstacle avoidance. Gazebo supports a stereo camera plugin, which lets us feed simulated stereo camera data directly into our autonomous controller. Using this data, we are able to add obstacle detection and avoidance to our project.
Python to the rescue!
ROS Kinetic and ROS Melodic (the ROS versions our software targets) support creating nodes with Python. This means that our autonomous controller node has the full power of Python’s robot vision libraries at its disposal for parsing and understanding camera data.
Using the OpenCV library we are able to generate a disparity map from the cameras and then use “mean pooling” to determine when our robot needs to avoid an obstacle and which direction it needs to orient to stay safe. Our robot is able to navigate to any target digsite on a grid, fill its drums with regolith at the target digsite, return to base and dump that regolith, then repeat the process again and again, all while avoiding obstacles! Additionally, the IMU sensor data enables the robot to detect when it’s fallen on its side so that it can attempt to self-right, which is demonstrated below.
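To sketch the mean-pooling step: split the disparity map into left, center, and right regions, average each, and steer away from high average disparity (in a disparity map, larger values mean nearer objects). The version below is a simplified pure-Python illustration with made-up numbers; the real node computes its disparity map from the stereo pair with OpenCV:

```python
def region_means(disparity_map):
    """Average disparity in the left, center, and right thirds of the map.

    `disparity_map` is a list of rows of disparity values; larger values
    indicate nearer obstacles.
    """
    width = len(disparity_map[0])
    third = width // 3
    sums = [0.0, 0.0, 0.0]
    counts = [0, 0, 0]
    for row in disparity_map:
        for x, value in enumerate(row):
            region = min(x // third, 2)  # clamp remainder columns to the right
            sums[region] += value
            counts[region] += 1
    return [s / c for s, c in zip(sums, counts)]

def steer(disparity_map, threshold=2.0):
    """Keep driving if the center is clear; otherwise turn toward the side
    with lower mean disparity (farther obstacles)."""
    left, center, right = region_means(disparity_map)
    if center < threshold:
        return "forward"
    return "left" if left < right else "right"

# A toy 3x6 disparity map with a near obstacle ahead and to the right.
toy_map = [
    [0.5, 0.5, 9.0, 9.0, 9.0, 9.0],
    [0.5, 0.5, 9.0, 9.0, 9.0, 9.0],
    [0.5, 0.5, 9.0, 9.0, 9.0, 9.0],
]
direction = steer(toy_map)
```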
Our Software is EZ to Use
We want people to install and use our software without needing to learn all of the minutiae of ROS or any of the other technologies that we employ. We wrote several scripts to ensure this is possible. With just a few commands, you can fully install our software and all of its dependencies.
Once you’ve installed the software, how do you use it? All of the components of our system are run using launch files. These XML files define which nodes ROS must start to accomplish some goal, such as starting the communication system or the simulation. There’s a launch file to start the simulation, another to spawn the model, another to start our autonomy scripts, and on and on and on. To simplify the experience, we created two top-level launch files:
- configurable_communication.launch
- configurable_simulation.launch
The former starts all of the communication packages needed to run our software on a real robot or in the simulation, and the latter fires up the simulation. Our launch files support a wide range of arguments that configure what they should do, hence their names.
Understanding the parameters of these two launch files is all that end-users really need to know to use our software.
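For flavor, a top-level launch file might look roughly like this. The argument, package, and node names below are illustrative, not copied from our actual files:

```xml
<launch>
  <!-- Illustrative only: argument and package names are hypothetical. -->
  <arg name="world" default="moon.world"/>

  <!-- Start Gazebo with the chosen world. -->
  <include file="$(find gazebo_ros)/launch/empty_world.launch">
    <arg name="world_name" value="$(arg world)"/>
  </include>

  <!-- Spawn the rover model into the running simulation. -->
  <node name="spawn_ezrassor" pkg="gazebo_ros" type="spawn_model"
        args="-urdf -param robot_description -model ezrassor"/>
</launch>
```

Passing different arguments on the command line (e.g. a different `world`) reconfigures the same launch file, which is what makes the two top-level files "configurable."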
What’s Next?
This release is just the beginning of the EZ-RASSOR. We have many exciting updates planned, including a project upgrade to ROS 2, inclusion of the ROS Navigation Stack for more advanced autonomous functions, creation of a massive GUI dashboard to monitor the rover, and the release of our mobile app on app stores. This project is also going to be actively developed this coming year by several more teams at UCF. For now, we want to share our project’s current state publicly for the first time and get feedback from the community.
To explore our repository, head on over to github.com/FlaSpaceInst/EZ-RASSOR.
TL;DR
A team of software engineers at UCF created a robotics software platform and simulation for an educational mining rover that NASA is currently building.
Special Thanks!
Mike Conroy of the Florida Space Institute, who served as our project sponsor from its inception.
Dr. Mark Heinrich of UCF, who provided guidance throughout this entire process.
Rob Mueller, Kurt Leucht, Jason Schueler, and all of the NASA Swamp Works engineers, who welcomed us into their lab to scope out requirements and share ideas.
All of the EZ-RASSOR developers. I will be forever grateful for the team’s expertise, commitment, and passion. We did it, guys!
