Autonomous RC Car Part 0

Mikkel Wilson
3 min read · Jan 31, 2019


TL;DR — This is part zero of a multi-part series on building a 1/10th scale autonomous car.

I’m building a robot.

Why?

Autonomous Vehicles (AVs) are already driving on roads. There are several approaches and technology stacks being advanced by some of the most well-funded organizations on the planet (Google, GM Cruise, Tesla), startup media darlings (Uber, Lyft, Plus.ai), and a constellation of technology providers (Here, Carmera, Velodyne, NVIDIA, Quanergy). With any transformational technology comes fear, misunderstanding, doubt, hype, and sometimes consensus. At present, the differing technical approaches aren’t showing a clear winner. Tesla’s radar-based systems have been implicated in several deaths. Uber’s LiDAR-based system was involved in a fatality in Arizona. Which approach will prove better? I’d like to be better informed myself, so I’m going to build one.

Why not just buy one?

Amazon’s DeepRacer

I did look at the Amazon DeepRacer but I’m not going to buy one. Three reasons:

  1. It’s not going to be released until March (though pre-release units are already being sold on eBay)
  2. It’s generally focused on reinforcement learning. This isn’t how self-driving AIs are being taught in the wild (we think).
  3. The DeepRacer is an optical-only system. While it’s quite capable, it’s not the only hardware in big-boy AVs.

While all the major players have optical camera systems, they also incorporate other sensors. Tesla uses a radar/sonar array; Cruise uses solid-state LiDAR (thanks to its acquisition of Strobe); Google’s Waymo, Lyft, and Uber all use Velodyne’s spinning LiDAR units.

There are other commercial and academic robots available, like the iRobot Create 2 or the Board of Education Arduino robot project (which I already own). These tend to be tank-drive bots, which don’t accurately represent the steering challenges inherent in a full-size AV. In short, that would be too easy.

What do you need?

With the exception of George Hotz, all the AV players are keeping their AI and self-driving algorithms closely guarded secrets. The prevailing wisdom is that they all use some system that mimics how human drivers drive. The AI community refers to this as Behavioral Cloning.

Udacity’s Self-Driving Nanodegree program trains with simulators. Note the steering angle in the upper left, and throttle on the bottom right.

Behavioral Cloning is a supervised learning process, typically implemented as a feed-forward neural network. It requires both the input signals (video, sensor readings, accelerometer data) and the corresponding outputs (steering angle, throttle) to train the AI.
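
To make that concrete, here’s a minimal sketch of what such a network could look like in Keras. This is an assumption on my part — the series hasn’t settled on a framework, and the 120x160 input resolution and layer sizes below are placeholders, loosely patterned on NVIDIA’s PilotNet:

```python
from tensorflow.keras import layers, models

def build_model():
    """Map a single camera frame to (steering, throttle)."""
    model = models.Sequential([
        # Normalize 8-bit pixels to [-1, 1]; 120x160 RGB frames are an assumption
        layers.Lambda(lambda x: x / 127.5 - 1.0, input_shape=(120, 160, 3)),
        # Convolutions extract road features (track edges, lane markings)
        layers.Conv2D(24, 5, strides=2, activation="relu"),
        layers.Conv2D(32, 5, strides=2, activation="relu"),
        layers.Conv2D(64, 3, strides=2, activation="relu"),
        layers.Flatten(),
        layers.Dense(100, activation="relu"),
        layers.Dense(50, activation="relu"),
        # Two continuous outputs: steering and throttle, each in [-1, 1]
        layers.Dense(2, activation="tanh"),
    ])
    # Plain regression against the human driver's recorded controls
    model.compile(optimizer="adam", loss="mse")
    return model
```

Training then amounts to feeding it recorded camera frames and minimizing the error against whatever the human driver did at that moment.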

Fun fact: If you own one of a few models of Honda, GM, or Hyundai, you can gather the steering angle and throttle (gas pedal) data from your car’s OBD-II port. Combine that with a timestamped video feed and you could clone your own driving behavior.
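
For the curious, here’s roughly what pulling the throttle signal over OBD-II looks like with the python-OBD library. Throttle position is a standard PID; steering angle is not — on those supported models it has to be decoded from manufacturer-specific CAN messages (projects like comma.ai’s opendbc catalog those), which this sketch skips:

```python
import time
import obd  # python-OBD: pip install obd

connection = obd.OBD()  # auto-detects an ELM327-style OBD-II adapter

while True:
    response = connection.query(obd.commands.THROTTLE_POS)
    if not response.is_null():
        # response.value is a Pint quantity (percent throttle)
        print(f"{time.time():.3f},{response.value}")
    time.sleep(0.1)  # ~10 Hz polling is plenty for logging
```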

To train this AI we’ll need to know the steering position and throttle sent from the RC transmitter (controller). Additionally, we’ll need to put a camera on board the car and find some way of recording all the telemetry.
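
As a rough sketch of what that recording could look like: an RC receiver outputs each channel as a PWM pulse between roughly 1,000 and 2,000 microseconds, so on a Raspberry Pi (an assumption — the actual hardware comes later in the series) we could timestamp those pulses with the pigpio library and log them alongside the camera frames. The GPIO pin numbers here are hypothetical:

```python
import time
import pigpio  # requires the pigpio daemon: sudo pigpiod

STEERING_PIN, THROTTLE_PIN = 17, 18  # assumed wiring
pulse_widths = {STEERING_PIN: 1500, THROTTLE_PIN: 1500}  # 1500 us = center
rise_ticks = {}

def edge_callback(gpio, level, tick):
    # level 1 = rising edge, 0 = falling edge; tick is in microseconds
    if level == 1:
        rise_ticks[gpio] = tick
    elif gpio in rise_ticks:
        # tickDiff handles the 32-bit tick counter wrapping around
        pulse_widths[gpio] = pigpio.tickDiff(rise_ticks[gpio], tick)

pi = pigpio.pi()
for pin in (STEERING_PIN, THROTTLE_PIN):
    pi.set_mode(pin, pigpio.INPUT)
    pi.callback(pin, pigpio.EITHER_EDGE, edge_callback)

# Log timestamped telemetry; a camera process would pair each row
# with the frame it captured at the same timestamp.
with open("telemetry.csv", "w") as log:
    log.write("timestamp,steering_us,throttle_us\n")
    while True:
        log.write(f"{time.time():.3f},"
                  f"{pulse_widths[STEERING_PIN]},"
                  f"{pulse_widths[THROTTLE_PIN]}\n")
        time.sleep(0.05)  # 20 Hz, matching a typical RC frame rate
```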

So, what are you building?

“No plan survives first contact with the enemy.” — Helmuth von Moltke the Elder

I’ve got a rough idea of where this will start, but I’m sure it will change along the way. In the next section we’ll talk about the platform we’re building on and some of the electronics.

Update: Continue to Part One.
