Building an Autonomous Car using a 1/10th Scale RC Car — Part 2

Eric Maggard
Sep 19, 2017


Part 1 went through setting up ROS Kinetic, the catkin workspace, and installing the necessary standard packages on a Raspberry Pi3. The writeup can be found here: https://medium.com/@ericmaggard/building-an-autonomous-car-using-a-1-10th-scale-rc-car-part-1-4474706d02b5

Part 2 will go into setting up the sensors and publishing the data to ROS. I will also download and build some packages that we might need to modify in the future.

The sensors/controllers to be covered are the following:

Logitech Gamepad F310 controller

Logitech C920 webcam

XV-11 Neato LIDAR

Adafruit Ultimate GPS module with antenna

Adafruit 10 DOF IMU breakout

Note: the Adafruit 10 DOF is discontinued and is being replaced by https://www.adafruit.com/product/1604. I am purchasing one and will test it against the 10 DOF sensor I currently have. If it is better, which I suspect it is, I will switch to the new one.

Downloading and Building Packages

In the future, I might want to modify some of these packages. Therefore, I am going to download the source code and build it under the catkin workspace. To do that, download each of the packages from the GitHub repositories below:

imu_compass: https://github.com/clearpathrobotics/imu_compass

robot_upstart: https://github.com/clearpathrobotics/robot_upstart

ros_arduino_bridge: https://github.com/hbrobotics/ros_arduino_bridge

usb_cam: https://github.com/ros-drivers/usb_cam

xv_11_laser_driver: https://github.com/rohbotics/xv_11_laser_driver

After downloading the source code, unzip the files into the catkin_ws/src directory. Once all of the files are expanded, open a Terminal window and cd to catkin_ws. From the catkin_ws directory, run the following commands to build and install the new packages:

$ catkin_make
$ catkin_make install

I actually got an error compiling the usb_cam package; it turned out other ROS packages had previously been installed on the image. I removed all of the installed ROS libraries and reinstalled the ROS base package and the other packages as in Part 1. After that, all of the packages compiled correctly.

Logitech Game Controller

I know I need to control the robot away from home and my WiFi network. Therefore, I need a controller that works away from my home network and around my subdivision. I thought about setting up a local network, but that means more power draw and more gear than I want to put on the robot. I am therefore going to try a Logitech Gamepad F310 controller that I already have. To get it working with ROS, I installed a couple of packages:

$ sudo apt-get install ros-kinetic-joy
$ sudo apt-get install ros-kinetic-teleop-twist-joy

The ros-kinetic-joy package is the plain joystick driver for ROS and directly outputs the values of all of the buttons and sticks; you need to translate those values into a velocity and turn angle yourself. The ros-kinetic-teleop-twist-joy package is a publisher that translates the joystick values into a velocity and twist (turn) rate. These can be taken directly and sent to the motor and steering servos.
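Conceptually, the translation teleop_twist_joy performs looks like the sketch below. The axis indices and scale factors here are assumptions for illustration (the real ones are set through the package's parameters), not necessarily the defaults for the F310:

```python
# Sketch of the joy -> cmd_vel mapping: joystick axes arrive in [-1, 1]
# and are scaled into linear and angular velocities.
# Axis indices and scales below are illustrative assumptions.

AXIS_LINEAR = 1      # assumed: left stick, up/down
AXIS_ANGULAR = 0     # assumed: left stick, left/right
SCALE_LINEAR = 0.5   # m/s at full deflection
SCALE_ANGULAR = 1.0  # rad/s at full deflection

def joy_to_twist(axes):
    """Translate raw joystick axes into (linear.x, angular.z) velocities."""
    linear_x = SCALE_LINEAR * axes[AXIS_LINEAR]
    angular_z = SCALE_ANGULAR * axes[AXIS_ANGULAR]
    return linear_x, angular_z
```

The joy node fills in the axes array; a node like teleop_twist_joy applies this scaling and publishes the result as a Twist on /cmd_vel.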

Start the joystick node by entering the following command:

$ rosrun joy joy_node

This will start the joystick node and publish data to the /joy topic. To view the topics that are being published, you can enter the command ‘rostopic list’ in the Terminal window. After verifying the /joy topic is being published, you can see the messages that are being published by entering the following command in a new Terminal window:

$ rostopic echo /joy

Now, once the joystick node is running, start the teleop_twist_joy node so the joystick values can be translated to velocity and angle. To start the node, enter the following in a new Terminal window:

$ rosrun teleop_twist_joy teleop_node

To then listen to the twist messages being posted, enter the following into a Terminal window:

$ rostopic echo /cmd_vel 

Here are examples of the output from the joy publisher node and then teleop_twist_joy publisher node.

Listening to rostopic /joy
Listening to rostopic /cmd_vel

Logitech C920 Webcam

The webcam can be plugged into any one of the USB slots on the Raspberry Pi 3. To verify that the camera is connected and working, first open an application that accesses the webcam. Either ‘cheese’ or ‘vlc’ should be preloaded on the system. If neither is on the system, vlc can be installed by entering this command in the Terminal window:

$ sudo apt-get install vlc

After verifying that the webcam works on the system, check which device it is connected to. The webcam is usually /dev/video*, where * is most likely 0 but could be 1, 2, or higher depending on what else is connected to the system. My webcam was /dev/video0. You can start the video streaming by running the following commands:

$ roscore

Then in another Terminal window:

$ rosrun usb_cam usb_cam_node
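If you are not sure which /dev/video* node the camera claimed, a small helper like this hypothetical one enumerates the candidates from Python:

```python
import glob

def list_video_devices():
    """Return the V4L2 device nodes present on the system (e.g. /dev/video0)."""
    return sorted(glob.glob('/dev/video*'))

# On my setup this would print ['/dev/video0']; unplug and replug the
# camera and diff the output to see which node it grabbed.
print(list_video_devices())
```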

XV-11 Neato LIDAR

I purchased an XV-11 LIDAR system from GetSurreal. The nice thing about it is that it comes in a self-contained package with a Teensy controller. I only had to connect the LIDAR with a mini-USB cable and start the xv11 laser node. To do that, run the following command:

$ rosrun xv_11_laser_driver neato_laser_publisher _port:=/dev/ttyACM0 _firmware_version:=2

You can view the published topic by entering the following command in a new Terminal window.

$ rostopic echo /scan

This displays the raw values as shown in the image below. The next section will show how to visualize the data in RViz on a separate computer, which will be good for offloading image processing or other computation if necessary.

LIDAR scan values published to the /scan topic
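Each /scan message is essentially a list of range readings at fixed angular increments. To use the data outside RViz, for example for obstacle detection, the polar readings have to be converted to Cartesian points in the laser frame. A minimal sketch of that conversion (the field names follow the LaserScan message; the filtering threshold is an assumption):

```python
import math

def scan_to_points(angle_min, angle_increment, ranges, range_max=6.0):
    """Convert LaserScan-style polar ranges to (x, y) points in the laser frame.

    Readings that are zero, NaN, inf, or beyond range_max are dropped,
    mirroring how a consumer of the /scan topic would typically filter
    the XV-11 data.
    """
    points = []
    for i, r in enumerate(ranges):
        if not (0.0 < r <= range_max):
            continue  # invalid or out-of-range reading
        angle = angle_min + i * angle_increment
        points.append((r * math.cos(angle), r * math.sin(angle)))
    return points
```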

Viewing published topics with RViz

After setting up the camera and the Neato LIDAR, let’s test connecting to the system from another computer. To do this, you will need to set up exports in the ~/.bashrc file. Get the IP address of the Raspberry Pi and enter it into the ~/.bashrc file on the Raspberry Pi with the following command:

$ echo "export ROS_IP=192.168.0.12" >> ~/.bashrc

Note that you will need to enter the actual IP from the Raspberry Pi into that command. Now get the IP address of your main system, and enter the following lines into the Terminal on your main system:

$ echo "export ROS_IP=192.168.0.16" >> ~/.bashrc
$ echo "export ROS_MASTER_URI=http://192.168.0.12:11311" >> ~/.bashrc

After the ~/.bashrc file has been updated, open a new Terminal window and run the following command (make sure the ROS nodes for the LIDAR and webcam are still running on the Raspberry Pi 3):

$ rosrun rviz rviz

This will bring up the RViz visualization software. In the interface, click on the “Add” button and open the “Image” and “LaserScan” topics that should be available under the “By Topic” tab. After adding those topics, the interface should look similar to the image below, where the video stream is in the lower left, and the LIDAR data is shown in the center map area.

Now you know you can connect to the master computer (the Raspberry Pi) and get ROS topics over your network. This will be handy in the future if you want to do image processing off of the Raspberry Pi and just send control data back.

Adafruit GPS module w/antenna

I followed the setup instructions: https://learn.adafruit.com/adafruit-ultimate-gps-on-the-raspberry-pi/using-uart-instead-of-usb

You will first need to set up the UART on the Raspberry Pi in order to communicate with the Adafruit GPS module. To do that, modify the /boot/cmdline.txt file by entering:

$ sudo nano /boot/cmdline.txt

Then, change the line to:

dwc_otg.lpm_enable=0 console=tty1 root=/dev/mmcblk0p2 rootfstype=ext4 elevator=deadline fsck.repair=yes rootwait

You then need to make a change to the /boot/config.txt file by running:

$ sudo nano /boot/config.txt

At the last line of the file, add the line:

enable_uart=1

Now you need to reboot the system and enter the following lines in a Terminal window:

$ sudo killall gpsd
$ sudo gpsd /dev/ttyS0 -F /var/run/gpsd.sock

Before continuing, we need to connect the GPS module to the Raspberry Pi 3 GPIO pins. Connect the Adafruit Ultimate GPS module to the UART0 lines. The following diagram shows how to connect the module:

Connecting the Adafruit Ultimate GPS module to the UART0 Lines

You can now try to test the GPS by entering the command in the Terminal:

$ cgps -s

This did not work for me, so I found a workaround:

$ sudo nano /etc/default/gpsd

I changed the lines to the following variables:

START_DAEMON="true"
GPSD_OPTIONS="-n"
DEVICES="/dev/ttyS0"
USBAUTO="true"
GPSD_SOCKET="/var/run/gpsd.sock"

After rebooting the system and running the ‘cgps -s’ command, it worked. Now to set up a GPS publisher to put the data out on a ROS message. First, let’s get a parser for the NMEA GPS message strings:

$ pip install pynmea2
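One thing worth knowing about NMEA sentences: the raw lat/lon fields are ddmm.mmmm strings, not decimal degrees. pynmea2’s latitude/longitude properties do this conversion for you, but a sketch of the underlying math is useful for sanity-checking the published fixes:

```python
def nmea_to_decimal(value, hemisphere):
    """Convert an NMEA ddmm.mmmm coordinate string to signed decimal degrees.

    e.g. ('4807.038', 'N') is 48 degrees 7.038 minutes -> 48.1173.
    Southern and western hemispheres are negative.
    """
    raw = float(value)
    degrees = int(raw // 100)        # everything left of the minutes
    minutes = raw - degrees * 100    # remaining minutes
    decimal = degrees + minutes / 60.0
    if hemisphere in ('S', 'W'):
        decimal = -decimal
    return decimal
```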

I created a new package for my navigation control by following this ROS tutorial: http://wiki.ros.org/ROS/Tutorials/CreatingPackage. After creating the package, I rebuilt catkin_ws using the commands:

$ catkin_make
$ catkin_make install

Now to setup the ROS publisher for the GPS data. Here is a basic ROS tutorial on Publishers and Subscribers: http://wiki.ros.org/rospy_tutorials/Tutorials/WritingPublisherSubscriber

The code for publishing the GPS data is in the section below, saved in a file called NavSatFix_publisher.py. Save the file to the scripts folder of the package you created above.

#!/usr/bin/env python
import sys

import rospy
import serial
import pynmea2
from sensor_msgs.msg import NavSatFix

# Open the serial port the GPS module is attached to
s = serial.Serial('/dev/ttyS0', 9600)
if s.isOpen() is False:
    sys.exit(1)

def gps_talker():
    pub = rospy.Publisher('navsat', NavSatFix, queue_size=10)
    rospy.init_node('gps_talker', anonymous=True)
    msg = NavSatFix()
    seq = 0
    while not rospy.is_shutdown():
        line = s.readline()
        data = pynmea2.parse(line)
        if line[1:5] == 'GPRM':  # $GPRMC: position, status, speed, course
            seq += 1
            msg.header.seq = seq
            msg.header.stamp = rospy.Time.now()
            msg.header.frame_id = 'world'
            if data.status == 'A':  # 'A' = active fix, 'V' = void
                msg.status.status = int(1)
            else:
                msg.status.status = int(0)
            msg.status.service = int(0)
            # pynmea2's latitude/longitude properties are decimal degrees;
            # the raw lat/lon fields are NMEA ddmm.mmmm strings
            msg.latitude = float(data.latitude)
            msg.longitude = float(data.longitude)
            # data.true_course and data.spd_over_grnd are also available
        if line[1:5] == 'GPGG':  # $GPGGA: position and altitude
            seq += 1
            msg.header.seq = seq
            msg.latitude = float(data.latitude)
            msg.longitude = float(data.longitude)
            # NavSatFix altitude is in meters, which is what GPGGA reports
            msg.altitude = float(data.altitude)
            pub.publish(msg)

if __name__ == '__main__':
    try:
        gps_talker()
    except rospy.ROSInterruptException:
        pass

After the file is created, change its permissions so it can be run as an executable:

$ chmod +x NavSatFix_publisher.py

Now run the file with the rosrun command, where robot_nav should be replaced by the package you created:

$ rosrun robot_nav NavSatFix_publisher.py

To listen to the /navsat topic, enter the following into a Terminal window:

$ rostopic echo /navsat

You should see the topic print something similar to the window shown below:

Message from the NavSatFix Publisher

Adafruit 10DOF sensor

In order to use the 10DOF sensor and read it over I2C, I leveraged the documentation from: http://ozzmaker.com/i2c/

Here is the schematic for wiring Adafruit 10 DOF using the I2C1 Lines:

Wiring for Adafruit 10 DOF breakout board

I used the RTIMULib library for communication with the 10 DOF: https://github.com/richards-tech/RTIMULib2. To build and install the library, cd to the RTIMULib2-master/Linux/python directory and run the following two lines:

$ sudo python setup.py build
$ sudo python setup.py install

The code for publishing the IMU data is in the section below, saved in a file called Imu_publisher.py. Save the file to the scripts folder of the package you created in the previous section.

#!/usr/bin/env python
import sys
sys.path.append('.')

import rospy
import RTIMU
import os.path
import time
import math
from sensor_msgs.msg import Imu

# RTIMULib reads its calibration and configuration from RTIMULib.ini
SETTINGS_FILE = "RTIMULib"
s = RTIMU.Settings(SETTINGS_FILE)

G_TO_MS2 = 9.80665  # RTIMULib reports acceleration in g; Imu expects m/s^2

def imu_talker():
    # setup publisher and node
    pub = rospy.Publisher('imu', Imu, queue_size=10)
    rospy.init_node('imu_talker', anonymous=True)
    rate = rospy.Rate(10)  # 10hz
    msg = Imu()
    imu = RTIMU.RTIMU(s)
    if not imu.IMUInit():
        sys.exit(1)
    # set fusion parameters
    imu.setSlerpPower(0.02)
    imu.setGyroEnable(True)
    imu.setAccelEnable(True)
    imu.setCompassEnable(True)
    seq = 0
    while not rospy.is_shutdown():
        if imu.IMURead():
            data = imu.getIMUData()
            seq += 1
            msg.header.seq = seq
            msg.header.stamp = rospy.Time.now()
            # orientation: the fused quaternion (RTIMULib order is
            # scalar first: w, x, y, z)
            if data['fusionQPoseValid']:
                msg.orientation.w = data['fusionQPose'][0]
                msg.orientation.x = data['fusionQPose'][1]
                msg.orientation.y = data['fusionQPose'][2]
                msg.orientation.z = data['fusionQPose'][3]
            # angular velocity comes from the gyro (rad/s)
            if data['gyroValid']:
                msg.angular_velocity.x = data['gyro'][0]
                msg.angular_velocity.y = data['gyro'][1]
                msg.angular_velocity.z = data['gyro'][2]
            # linear acceleration comes from the accelerometer
            if data['accelValid']:
                msg.linear_acceleration.x = data['accel'][0] * G_TO_MS2
                msg.linear_acceleration.y = data['accel'][1] * G_TO_MS2
                msg.linear_acceleration.z = data['accel'][2] * G_TO_MS2
            pub.publish(msg)
        rate.sleep()

if __name__ == '__main__':
    try:
        imu_talker()
    except rospy.ROSInterruptException:
        pass

From the RTIMULib2/Linux/python/tests directory, copy the RTIMULib.ini file to the folder where the publisher file is saved. This file is read to set up the settings class in the line: s = RTIMU.Settings(SETTINGS_FILE). Then, make sure you change the permissions so the file can be executed. Now, start the IMU publisher:

$ rosrun robot_nav Imu_publisher.py

You should see the topic print something similar to the window shown below:

Message from the IMU Publisher
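One note on the orientation field: the Imu message expects a unit quaternion. If you would rather work from RTIMULib’s fused Euler pose (data['fusionPose'], roll/pitch/yaw in radians), it can be converted with a standard helper; the sketch below uses the common ZYX (yaw-pitch-roll) convention:

```python
import math

def euler_to_quaternion(roll, pitch, yaw):
    """Convert roll/pitch/yaw in radians to a quaternion (x, y, z, w).

    Uses the ZYX (yaw-pitch-roll) rotation convention common in robotics.
    """
    cr, sr = math.cos(roll / 2), math.sin(roll / 2)
    cp, sp = math.cos(pitch / 2), math.sin(pitch / 2)
    cy, sy = math.cos(yaw / 2), math.sin(yaw / 2)
    x = sr * cp * cy - cr * sp * sy
    y = cr * sp * cy + sr * cp * sy
    z = cr * cp * sy - sr * sp * cy
    w = cr * cp * cy + sr * sp * sy
    return x, y, z, w
```

The four return values map directly onto msg.orientation.x/y/z/w.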

Now, all of the sensors I am planning to use are up and running on the Raspberry Pi 3. I haven’t been able to get a Movidius compute stick yet, so I am going to duplicate the ROS and sensor setup on the Jetson TX-1 board.

Part 3

I have been rethinking using the Logitech controller for manually navigating the robot around the subdivision. What I am going to try instead is a Teensy 3.2 board that monitors the RC signals from the RC controller. In “Manual” mode, the Teensy will reproduce those RC PWM signals to the motor and the steering. In “Autonomous” mode, the Teensy will take input from the Raspberry Pi 3 and send the PWM signals to the motor and steering. That way, the car will not be tethered, with me following close behind.

So, in the next part, I will see how well a Teensy can duplicate RC PWM signals.

Truck Build

I also just received the Traxxas Slash truck and am starting to print 3D parts to hold the sensors. I will do the writeup for the build and wiring in Part 4.
