Autonomous RC Car Part 1

Mikkel Wilson
Feb 4, 2019


In Part zero we discussed ‘why’. Now we’ll talk about ‘what’.

Chassis

My son builds complicated RC cars. They have hundreds of parts and require hours of careful assembly. If you like Legos, it’s a great step up. Just don’t expect the kits to come cheap. This isn’t a $30 toy from Target; these cars can easily reach 40mph and, with tuning, 80mph.

They have full suspension, front-wheel steering, 4 wheel drive with front and rear differentials, and brushed electric motors. Many robot platforms come with stepper motors instead, which are much easier to calibrate and which make it simple to determine how many rotations actually occurred. That distance-traveled information is really useful when implementing SLAM (more about that later), and front-wheel steering mimics a full-sized AV more closely than tank-style drive systems do. The errors introduced by the suspension, differentials, pneumatic tires, and steering calibration will also affect the PID motion control equations we’ll have to implement later.
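To make the PID mention concrete, here’s a minimal sketch of the kind of control loop we’ll eventually need. The class name and gains are illustrative placeholders; real values would come from tuning on the actual chassis.

```python
class PID:
    """Minimal PID controller: output = kp*e + ki*∫e dt + kd*de/dt."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, error, dt):
        # Accumulate the integral term over elapsed time dt (seconds).
        self.integral += error * dt
        # Derivative term needs a previous sample; zero on the first call.
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```

For steering, the error might be the lateral offset from a desired path; for throttle, the difference between target and measured speed.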

Tamiya TT-02 chassis

Not pictured here, but this model also comes with standoffs which will make mounting the computer hardware easier.

Telemetry & Telematics

I did some light reading and discovered that RC receiver units all just spit out Pulse Width Modulated (PWM) signals. PWM is really easy to produce with an Arduino (the Pulse example project is usually the second thing everyone does with an Arduino). It should be trivial to receive the PWM signals from the RC receiver, convert them into … something … send it … somewhere … and then resend the signal to the steering servos and ESC (Electronic Speed Controller — the ‘throttle’ for this project).

Arduino Nano running the Pulse example program.
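The conversion step is mostly a mapping problem. A typical RC channel pulses between roughly 1000µs and 2000µs with 1500µs at center, so a sketch of the mapping might look like the following — note the endpoint values are the usual RC convention, not measured from this particular receiver:

```python
def pwm_to_normalized(pulse_us, min_us=1000, max_us=2000):
    """Map an RC pulse width in microseconds to [-1.0, 1.0].
    1000/1500/2000µs is the common RC convention; a real receiver's
    range should be measured before trusting these defaults."""
    pulse_us = max(min_us, min(max_us, pulse_us))  # clamp glitchy pulses
    return (pulse_us - min_us) / (max_us - min_us) * 2.0 - 1.0

def normalized_to_pwm(value, min_us=1000, max_us=2000):
    """Inverse mapping, for resending a command to the servo/ESC."""
    value = max(-1.0, min(1.0, value))
    return int(round((value + 1.0) / 2.0 * (max_us - min_us) + min_us))
```

Normalizing to [-1, 1] keeps the recorded telemetry independent of any one receiver’s pulse range, which matters later when training on the data.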

The ‘something’ and ‘somewhere’ are pretty easy to factor in. We need a camera, some place to store images, and a computer to process them and, eventually, run the AI. Something faster than an Arduino would be great. The NVIDIA Jetson TX2 would be awesome, but at $479 that kind of blows up the BoM. Seems like a good job for your favorite $35 computer, the Raspberry Pi 2 B+ and the Raspberry Pi Camera Module. Since I already have several of these, I’m using them. I’m expecting an RPi 3 will be an early upgrade.

You might ask: why not just record and replay the PWM signals with the Raspberry Pi directly? It has GPIO pins, right? Well, RPis run Linux, which isn’t a real-time operating system. ‘Jitter’ can be introduced into PWM signals generated from Raspberry Pi GPIO pins, which makes them unsuitable for driving things like stepper motors and servos. So we’ll use the Arduino to handle the PWM signal input/output, and the RPi to handle the video processing (OpenCV), recording, and running the AI (Keras).

Next we need some way for the Arduino and Raspberry Pi to communicate. We know PWM is out, but I2C is a good wire protocol for reliably sending small bits of data and is well supported by RPi’s GPIO and Arduino. Also, it could support multiple devices if we need to expand in the future. So, our ‘something’ will be an invented wire protocol that encodes PWM channels and sends them across I2C. This is effectively the CAN bus for our car.
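As a sketch of what that invented protocol could look like, here’s one possible frame layout: a channel count byte, two big-endian bytes per PWM channel value, and an XOR checksum. This exact layout is my own illustration, not a description of the finished protocol:

```python
import struct

def encode_frame(channels):
    """Pack PWM channel values (in microseconds) into an I2C-friendly frame:
    [count][ch0 hi][ch0 lo]...[XOR checksum]. Illustrative layout only."""
    payload = bytes([len(channels)]) + b"".join(struct.pack(">H", c) for c in channels)
    checksum = 0
    for b in payload:
        checksum ^= b
    return payload + bytes([checksum])

def decode_frame(frame):
    """Validate the checksum and unpack the channel values."""
    payload, checksum = frame[:-1], frame[-1]
    calc = 0
    for b in payload:
        calc ^= b
    if calc != checksum:
        raise ValueError("checksum mismatch")
    n = payload[0]
    return [struct.unpack(">H", payload[1 + 2 * i:3 + 2 * i])[0] for i in range(n)]
```

The checksum is cheap insurance: I2C is fairly reliable over short wires, but a corrupted throttle byte is the kind of failure worth one extra byte to catch.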

LiDAR

So far, with a Raspberry Pi, an Arduino, and a Tamiya TT-02 RC car chassis, we’ve basically just replicated the capability of the DeepRacer, with the added ability to drive it remotely with an RC transmitter.

As much as I’d love a Velodyne Alpha Puck, it’s simply out of my price range for this project. There are two low-cost LiDAR units currently available: the Neato XV-11 and the RPLIDAR A1.

Neato XV-11 on the left. RPLIDAR A1 on the right.

These LiDAR units do have some differences from the ones used in full-sized AVs. Most notably, they’re only 2D: I’ll get a single swath of light swept around the robot. Anything smaller than the height of the sensor will go undetected. There may be a way of mounting the LiDAR at an angle toward the direction of travel to account for this, but I’ll have to experiment once it’s actually built.
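The tilted-mount idea is simple geometry: a beam tilted down by some angle from a known mounting height will intersect flat ground at a predictable distance ahead. A quick sketch (flat ground assumed, obstacle shapes ignored):

```python
import math

def ground_hit_distance(mount_height_m, tilt_deg):
    """Distance ahead at which the forward beam of a 2D LiDAR, tilted
    down by tilt_deg from horizontal, meets flat ground.
    Illustrative geometry only; real terrain won't be this cooperative."""
    if tilt_deg <= 0:
        return math.inf  # a level or upward beam never hits flat ground
    return mount_height_m / math.tan(math.radians(tilt_deg))
```

A 20cm mounting height and a 10° downward tilt would put the ground return a bit over a meter ahead, which hints at the trade-off: steeper tilt sees low obstacles sooner but shortens the sensing horizon.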

Another important difference is that these LiDAR units don’t determine distance the same way as the Velodyne or Pulse systems. The big commercial units are Time of Flight LiDAR systems — they send pulses of light, count the time delay until the pulses are detected, multiply by the speed of light in air, and out comes an accurate distance to that point. These low-cost units use ‘triangulation’ to determine distance: light up a laser, see where the spot lands on a detector, and use the incident angle and a little trigonometry to determine distance.
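Both ranging principles reduce to one-line formulas. The numbers below (baseline, focal length in pixels) are made-up illustrations, not the actual optics of either unit:

```python
def time_of_flight_distance(round_trip_s, c=299_792_458.0):
    """ToF ranging: the pulse travels out and back, so distance is c*t/2."""
    return c * round_trip_s / 2.0

def triangulation_distance(baseline_m, focal_px, offset_px):
    """Laser triangulation: a farther target shifts the laser spot less on
    the sensor, so range is d = f * s / x (baseline s, focal length f,
    spot offset x). Example values are hypothetical, not the XV-11's."""
    if offset_px <= 0:
        raise ValueError("spot offset must be positive")
    return focal_px * baseline_m / offset_px
```

The triangulation formula also explains why these cheap units degrade at range: distance varies as 1/x, so at long range a sub-pixel change in spot position swallows a large change in distance.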

The Neato XV-11 is a replacement part for a robot vacuum — think Roomba with lasers. The LiDAR unit isn’t exactly a product on its own, so don’t expect commercial support or fancy documentation. Most of its operation has been reverse engineered and published for other hobbyists to use and expand upon. Casual reading does indicate that the software works and that users have generated point clouds with it. Its great advantage is that it’s cheap: you can find them on eBay for under $60.

The RPLIDAR A1 is a bit more expensive at around $99. Its stats look relatively similar to the Neato XV-11’s, though that may be my naivety showing. I’m tempted to go with the RPLIDAR because of its commercial backing, and the $40 difference isn’t dramatic when I’m only building one of these.

Assembly

Next steps are to get components working together, capture some telemetry, and build a pipeline for training data. See you in the next section.

Update: Continue to Part Two.
