RACECAR: A Powerful Platform for Robotics Research and Teaching

Synced | Published in SyncedReview | Sep 5, 2017

An autonomous robot project by Massachusetts Institute of Technology students.

Outline:
Section 1: Introduction
Section 2: Course Description
Section 3: Images
Section 4: Hardware Overview
Section 5: Software Overview
Section 6: Algorithm Introduction
Section 7: Conclusion

Introduction

The robotic mini-car race originated in an MIT robotics course called “Robotics: Science and Systems” (6.141/16.405), whose goal is to teach robotics with the RACECAR platform.

RACECAR is a powerful platform for robotics research and teaching, and the course built around it is one of the best-known university robotics courses in the field.

A short video makes it easy to appreciate what the race car can do (https://mit-racecar.github.io).

Course Description

“We will design and implement perception and planning algorithms for cars that can quickly navigate through complex environments. The class will be divided into six teams. Each team will be given one RC race car, powered by an NVIDIA Jetson embedded supercomputer as well as an inertial measurement unit, a visual odometer, a laser scanner, and a camera. We will teach not only the basics of perception and planning algorithms, but we will also show the participants how to run the Robot Operating System (ROS) on the NVIDIA platform while interfacing with sensors and actuators. At the end of the course, we will race through the MIT tunnels to determine the winning team!”

Here’s a link to the GitHub repositories containing the hardware and software designs for the car.

Images

Here are some images of the RACECAR:

Hardware Overview

Overall, the design of the car is straightforward. Most of the hardware used is available off the shelf. Here’s a list of the major components:

  • The R/C Car — Traxxas Rally 7407
  • On-board computer — NVIDIA Jetson TK1
  • 2D LIDAR — Hokuyo UST-10LX
  • Camera — Point Grey Firefly MV
  • Battery for electronics — Energizer XP8000AB

Many of the electronic components, such as the opto-isolator board and the Razor 9DOF IMU, come from SparkFun.

The vehicle’s structure is augmented with acrylic platforms that mount the sensors and electronics, along with some 3D-printed parts for the structure itself.

An optical-flow visual odometry sensor, a PX4FLOW, is mounted on the top platform. In practice, however, the device was not used much because it did not provide sufficient resolution for the environments where the cars operated.

There is one custom-built electronic part on the vehicle: a circuit board that connects to the Jetson J3 header, providing access to the Jetson’s GPIO signals and adding a real-time clock and an opto-isolator. The GPIO access is used to send PWM signals to the vehicle’s servos and motors; a rough sketch of this mapping follows.
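
To make the PWM idea concrete, here is a minimal, hypothetical sketch of how a steering command could be mapped to a hobby-servo pulse width. The 50 Hz frame and 1.0–2.0 ms pulse range are common hobby-servo conventions; the steering-angle range is an assumption for illustration, not a value from the RACECAR design.

```python
# Hypothetical sketch: mapping a steering angle to a hobby-servo PWM pulse.
# Standard hobby servos expect a ~50 Hz PWM signal whose high pulse lasts
# roughly 1.0-2.0 ms, with 1.5 ms as center. The angle range is assumed.

def steering_to_pulse_us(angle_deg, max_angle_deg=30.0):
    """Map a steering angle in [-max, +max] degrees to a pulse width (us)."""
    # Clamp the command to the servo's assumed mechanical range.
    angle_deg = max(-max_angle_deg, min(max_angle_deg, angle_deg))
    center_us, half_range_us = 1500.0, 500.0
    return center_us + (angle_deg / max_angle_deg) * half_range_us

if __name__ == "__main__":
    for a in (-30, 0, 15, 30):
        print(a, "deg ->", steering_to_pulse_us(a), "us")
```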

Detailed hardware specifications and assembly instructions are available here:
https://github.com/mit-racecar/hardware/blob/master/racecar-2.0/drawing_package.pdf

Software Overview

The software is based on the Robot Operating System (ROS) running on the Jetson.

ROS nodes for the Hokuyo LIDAR and other sensors are part of the software package given to students.

Students integrate existing software modules (such as drivers for reading sensor data) with their own custom algorithms to quickly develop a complete autonomous system.

The programming languages used are mainly C++ and Python.
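
As an illustration of how these pieces fit together, here is a minimal sketch of a student-style ROS node in Python. The topic names (/scan, /drive), the message types, and the toy stopping logic are assumptions for this sketch, not the official lab code.

```python
#!/usr/bin/env python
# Minimal sketch of a ROS controller node. Topic names and logic are
# illustrative assumptions, not the official MIT RACECAR labs.
import rospy
from sensor_msgs.msg import LaserScan
from ackermann_msgs.msg import AckermannDriveStamped

def scan_callback(scan):
    # Toy logic: drive forward, but stop when the closest obstacle is near.
    closest = min((r for r in scan.ranges if r > scan.range_min),
                  default=float("inf"))
    cmd = AckermannDriveStamped()
    cmd.drive.speed = 1.0 if closest > 1.0 else 0.0
    drive_pub.publish(cmd)

if __name__ == "__main__":
    rospy.init_node("example_controller")
    drive_pub = rospy.Publisher("/drive", AckermannDriveStamped, queue_size=1)
    rospy.Subscriber("/scan", LaserScan, scan_callback)
    rospy.spin()  # hand control to ROS; the callback fires as scans arrive
```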

Algorithm Introduction

The example lab algorithms are listed at https://github.com/mit-racecar/TA_example_labs. Students need to write the racing and positioning algorithms themselves:

  • Ta_lab3: Wall following:

The task is to follow a wall. A simple robot can do this with a single IR sensor and two motors; the RACECAR uses an IR ranging sensor to follow the wall.

See the tutorial on sensors for a comparison of IR and sonar sensors.

One wall-following method uses IR sensors and divides the environment into a number of cells, representing the wall as a set of lines, points, and similar primitives.

Another wall-following algorithm uses image processing: after capturing images or video with a camera, it extracts features such as edges and corners using background subtraction, and feeds them to the control system that drives the robot.

This is one way the car can avoid obstacles; a minimal control sketch is shown below.
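
A hypothetical PD wall-follower, assuming a single side-facing range sensor; the gains and the 0.5 m setpoint are made up for illustration:

```python
# Hypothetical PD wall-follower: steer to hold a fixed distance from a wall
# measured by a single side-facing range sensor. Gains and setpoint are
# illustrative assumptions. Convention: positive command = steer away.

class WallFollower:
    def __init__(self, desired_dist=0.5, kp=1.0, kd=0.3):
        self.desired_dist = desired_dist
        self.kp, self.kd = kp, kd
        self.prev_error = 0.0

    def steering(self, measured_dist, dt):
        """Return a steering command from the current wall distance."""
        error = self.desired_dist - measured_dist  # >0 means too close
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.kd * derivative

# Example: the car is 0.7 m out, so the command steers it toward the wall.
follower = WallFollower()
print(follower.steering(measured_dist=0.7, dt=0.05))
```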

  • Ta_lab4: Visual servoing:

Visual servoing is a rapidly maturing approach to robot control based on visual perception of the locations of the robot and the wall.

More concretely, visual servoing uses one or more cameras and a computer vision system to control the position of the robot relative to the wall as required by the task.

This gives the car another way to avoid obstacles; a simplified steering sketch follows.
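
As a simplified illustration of the servoing loop, the sketch below steers toward a tracked image feature by converting its pixel offset into an approximate bearing. The image width, field of view, and gain are assumptions, not the camera’s actual parameters.

```python
# Hypothetical image-based visual-servoing step: keep a tracked feature
# centered by steering proportionally to its horizontal pixel offset.
import math

IMAGE_WIDTH = 640           # pixels (assumed camera resolution)
FOV_RAD = math.radians(60)  # assumed horizontal field of view

def steering_from_feature(feature_x_px, gain=0.8):
    """Map the feature's horizontal pixel position to a steering angle (rad)."""
    # Offset from the image center, normalized to [-0.5, 0.5].
    offset = (feature_x_px - IMAGE_WIDTH / 2.0) / IMAGE_WIDTH
    # Approximate bearing to the feature, then steer toward it.
    bearing = offset * FOV_RAD
    return gain * bearing

# Feature detected right of center (pixel column 480) -> steer right.
print(steering_from_feature(480))
```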

  • Ta_lab5: Particle filter localization:

In simple terms, the particle filter approximates the probability density function of the state with a set of random samples propagated through the state space; integrating over the weighted samples gives the mean, which serves as the minimum-variance estimate of the state.

Your robot has been kidnapped and transported to a new location! Luckily, it has a map of this location, a (noisy) GPS estimate of its initial location, and lots of (noisy) sensor and control data.

Your particle filter will be given a map and some initial localization information (analogous to what a GPS would provide). At each time step your filter will also get observation and control data.

Knowing where it is lets the car avoid obstacles and follow its route; a minimal particle filter sketch is shown below.
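
Here is a minimal 1-D particle filter sketch that shows the predict/update/resample cycle described above. The state, noise levels, and measurement model are toy assumptions, not the lab’s actual code.

```python
# Minimal 1-D particle filter sketch (illustrative, not the lab's code):
# predict with a noisy control input, weight by a measurement, resample.
import math
import random

N = 500
particles = [random.uniform(0.0, 10.0) for _ in range(N)]  # initial guess

def gaussian(x, mu, sigma):
    """Likelihood of x under a normal distribution N(mu, sigma^2)."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def step(control, measurement, motion_noise=0.1, meas_noise=0.3):
    global particles
    # 1. Predict: move every particle by the control input plus noise.
    particles = [p + control + random.gauss(0.0, motion_noise) for p in particles]
    # 2. Update: weight particles by how well they explain the measurement
    #    (toy model: the sensor directly measures the 1-D state).
    weights = [gaussian(measurement, p, meas_noise) for p in particles]
    total = sum(weights) or 1e-12
    weights = [w / total for w in weights]
    # 3. Resample: draw particles in proportion to their weights.
    particles = random.choices(particles, weights=weights, k=N)
    # Estimate: the sample mean approximates the minimum-variance estimate.
    return sum(particles) / N

print(step(control=1.0, measurement=3.2))
```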

  • Ta_lab6: Pure pursuit + A* trajectory planning:

Pure pursuit is a tracking algorithm that works by calculating the curvature that will move a vehicle from its current position to some goal position.

The whole point of the algorithm is to choose a goal position that is some distance ahead of the vehicle on the path.

The goal is to autonomously navigate and drive the robot along the path by continually generating speed and steering commands that compensate for tracking errors, which mainly consist of the vehicle’s deviations in distance and heading from the path.

This is how the car follows a planned path around obstacles; a sketch of the pure-pursuit geometry follows.
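
The core geometry is compact enough to sketch directly, following Coulter’s formulation [13]: the curvature of the arc through the goal point is 2x / L², where x is the goal’s lateral offset in the vehicle frame and L is the lookahead distance. The wheelbase value and sign convention below are illustrative assumptions.

```python
# Pure-pursuit steering sketch (after Coulter [13]): curvature = 2x / L^2,
# where x is the goal's lateral offset in the vehicle frame and L is the
# lookahead distance. The wheelbase is an assumed value.
import math

def pure_pursuit_steering(goal_x, goal_y, wheelbase=0.33):
    """Goal point (goal_x lateral, goal_y forward), vehicle frame -> steering angle (rad)."""
    L2 = goal_x ** 2 + goal_y ** 2            # squared lookahead distance
    curvature = 2.0 * goal_x / L2             # arc that passes through the goal
    return math.atan(wheelbase * curvature)   # bicycle-model steering angle

# Goal 1 m ahead and 0.2 m to the side: the car steers onto the arc.
print(pure_pursuit_steering(goal_x=0.2, goal_y=1.0))
```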

Conclusion

The MIT students’ work in this course centers on implementing and optimizing these algorithms, which makes it a great project for training students in the field of intelligent and sensory systems.

From a teaching point of view, this course has a lot of educational value: it is practical, and it inspires students’ enthusiasm for learning. However, its shortcomings are also obvious; students gain only basic robotic programming skills and hardware configuration abilities, and nothing deeper.

The MIT RaceCar is not for commercial use.

References

[1] Particle Filter Location http://www.cnblogs.com/rubbninja/p/6220284.html

[2] Kalman filter and its use http://www.cnblogs.com/rubbninja/p/6220284.html

[3] https://en.wikipedia.org/wiki/Visual_servoing

[4] https://mit-racecar.github.io

[5] http://www.instructables.com/id/Wall-Following-Robot-Car/

[6] http://forums.trossenrobotics.com/tutorials/how-to-diy-128/following-a-wall-3283/

[7] https://wenku.baidu.com/view/ae1e928aa8956bec0875e3b5.html

[8] https://web.wpi.edu/Pubs/E-project/Available/E-project-042513-124910/unrestricted/MQP_Report.pdf

[9] VISUAL CONTROL OF ROBOTS Peter I. Corke http://www.petercorke.com/bluebook/book.pdf

[10] Sebastian Thrun http://robots.stanford.edu/papers/thrun.pf-in-robotics-uai02.pdf

[11] https://github.com/ksakmann/Particle-Filter

[12] https://www.cs.princeton.edu/courses/archive/fall11/cos495/COS495-Lab7-Localization.pdf

[13] http://www.ri.cmu.edu/pub_files/pub3/coulter_r_craig_1992_1/coulter_r_craig_1992_1.pdf

[14] http://www.cs.jhu.edu/~hager/Public/Publications/TutorialTRA96.pdf

Author: Jie Zhu | Editor: Zhen Gao | Localized by Synced Global Team: Xiang Chen
