How Udacity’s Self-Driving Car Students Approach Behavioral Cloning
Udacity believes in project-based education. Our founder, Sebastian Thrun, likes to say that you don’t lose weight by watching other people exercise. You have to write the code yourself!
Every module in the Udacity Self-Driving Car Engineer Nanodegree Program builds up to a final project. The Deep Learning Module culminates in one of my favorites: Behavioral Cloning.
The goal of this project is for students to build a neural network that “learns” how to drive a car like a human. Here’s how it works:
First, each student records his or her own driving behavior by driving the car around a test track in the Udacity simulator.
Then, each student uses this data to train a neural network to drive the car around the track autonomously.
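For readers who want a concrete picture of what that pipeline looks like, here is a minimal sketch in Keras. It assumes the simulator's driving_log.csv layout (center, left, right, steering, throttle, brake, speed) and 160x320x3 frames; a real submission would add far more data handling and a deeper network, but the shape of the problem is the same: images in, steering angle out.

```python
# Minimal behavioral-cloning sketch: read the simulator log, load center-camera
# frames with their steering angles, and fit a small regression CNN.
import csv

import cv2
import numpy as np
from tensorflow.keras.layers import Conv2D, Dense, Flatten, Lambda
from tensorflow.keras.models import Sequential

images, angles = [], []
with open('driving_log.csv') as f:
    for row in csv.reader(f):
        images.append(cv2.imread(row[0]))   # center-camera frame
        angles.append(float(row[3]))        # recorded steering angle

X_train, y_train = np.array(images), np.array(angles)

model = Sequential([
    Lambda(lambda x: x / 255.0 - 0.5, input_shape=(160, 320, 3)),  # normalize pixels
    Conv2D(24, (5, 5), strides=(2, 2), activation='relu'),
    Conv2D(36, (5, 5), strides=(2, 2), activation='relu'),
    Conv2D(48, (5, 5), strides=(2, 2), activation='relu'),
    Flatten(),
    Dense(100, activation='relu'),
    Dense(1),  # a single output: the predicted steering angle
])
model.compile(loss='mse', optimizer='adam')
model.fit(X_train, y_train, validation_split=0.2, shuffle=True, epochs=5)
model.save('model.h5')
```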
There are all sorts of neat ways to approach this problem, and it seems like Udacity students tried all of them! Here are excerpts from—and links to—blog posts written by five of our Self-Driving Car students, each of whom takes a different approach to the project.
Training a Self-Driving Car via Deep Learning
James Jackson’s post is a great overview of how to approach this project, and he adds a twist by implementing data smoothing. We didn’t cover data smoothing in the instructional material, so this is one of many examples of Udacity students going above and beyond the curriculum to build terrific projects.
“Recorded driving data contains substantial noise. Also, there is a large variation in throttle and speed at various instances. Smoothing steering angles (ex. SciPy Butterworth filter), and normalizing steering angles based on throttle/speed, are both investigated.”
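As a rough illustration of the smoothing idea, here is what a low-pass Butterworth filter over a recorded steering trace can look like with SciPy. The filter order and cutoff below are illustrative guesses, not James’s actual parameters.

```python
# Zero-phase low-pass smoothing of a noisy steering-angle time series.
import numpy as np
from scipy.signal import butter, filtfilt

def smooth_steering(angles, order=3, cutoff=0.05):
    """Low-pass filter a steering trace.

    `cutoff` is a fraction of the Nyquist frequency; tune it to taste.
    """
    b, a = butter(order, cutoff, btype='low')
    return filtfilt(b, a, angles)  # filtfilt runs forward and backward, so no phase lag

# Example: smooth a synthetic noisy steering trace.
raw = np.sin(np.linspace(0, 10, 1000)) + np.random.normal(scale=0.1, size=1000)
smoothed = smooth_steering(raw)
```

The zero-phase filter matters here: a one-directional filter would delay the steering signal relative to the images it labels.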
Behavioral Cloning
This is a terrific post about the mechanics of building a behavioral cloning model. It really stands out for JC’s investigation of Gradient Activation Mappings to show which pixels in an image have the most effect on the model’s output.
“The whole idea is to use a heatmap to highlight the local areas contributing most to the final decision. It was designed for classification purposes, but with a slight change it can be applied to our steering angle predictions.”
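For the curious, here is a sketch of how such a gradient-based heatmap can be computed for a steering regressor with tf.keras. JC’s implementation may differ; the conv layer name is a placeholder, and the “slight change” is exactly the one the quote describes, using the predicted angle in place of a class score.

```python
# Grad-CAM-style heatmap for a regression model: which image regions most
# influence the predicted steering angle?
import numpy as np
import tensorflow as tf

def steering_heatmap(model, image, conv_layer_name='last_conv'):
    """Return a (h, w) map of how strongly each region drives the predicted angle."""
    grad_model = tf.keras.Model(
        model.inputs,
        [model.get_layer(conv_layer_name).output, model.output],
    )
    inp = image.astype(np.float32)[np.newaxis, ...]   # add a batch dimension
    with tf.GradientTape() as tape:
        conv_maps, angle = grad_model(inp)
        target = angle[0, 0]                          # regression: use the angle itself
    grads = tape.gradient(target, conv_maps)          # d(angle) / d(feature maps)
    weights = tf.reduce_mean(grads, axis=(0, 1, 2))   # average-pool the gradients
    cam = tf.reduce_sum(conv_maps[0] * weights, axis=-1)  # weighted sum of feature maps
    cam = tf.nn.relu(cam)                             # keep positive evidence only
    return (cam / (tf.reduce_max(cam) + 1e-8)).numpy()    # normalize to [0, 1]
```

Upsampled back to the input resolution and overlaid on the frame, the map shows whether the network is looking at lane lines or at irrelevant scenery.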
Behavioural Cloning Applied to Self-Driving Car on a Simulated Track
This post has a great discussion of data augmentation techniques for neural network training, including randomly jittering data from the training set. Joshua used over 100,000 images for training!
“Though there were more than 100,000 training samples, each epoch consisted of 24,064 samples. This made the training more tractable, and since we were using a generator, all of the training data was still used in training, just at different epochs.”
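Here is a rough sketch of that generator pattern: each step draws a fresh random batch from the full set and jitters it slightly before handing it to the model. The batch size, shift range, and per-pixel steering correction are illustrative values, not Joshua’s actual settings.

```python
# Batch generator with random horizontal jitter; `samples` is assumed to be a
# list of (image_path, steering_angle) pairs.
import random

import cv2
import numpy as np

def jitter(image, angle, max_shift=20, angle_per_px=0.004):
    """Randomly shift the image sideways and correct the steering angle to match."""
    shift = random.uniform(-max_shift, max_shift)
    M = np.float32([[1, 0, shift], [0, 1, 0]])     # horizontal translation matrix
    h, w = image.shape[:2]
    return cv2.warpAffine(image, M, (w, h)), angle + shift * angle_per_px

def batch_generator(samples, batch_size=64):
    """Yield (images, angles) batches forever, drawing a fresh random batch each step."""
    while True:
        batch = random.sample(samples, batch_size)
        images, angles = [], []
        for path, angle in batch:
            img, ang = jitter(cv2.imread(path), angle)
            images.append(img)
            angles.append(ang)
        yield np.array(images), np.array(angles)

# With steps_per_epoch * batch_size = 376 * 64 = 24,064, each "epoch" sees only a
# slice of the full 100,000+ images, but over many epochs nearly all of them get used:
# model.fit(batch_generator(samples), steps_per_epoch=376, epochs=10)
```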
Self Driving Car — Technology drives the Future !!
Sujay applied a number of different augmentations to his training data, including brightness and shadow augmentations. This helped his model generalize to a new, darker test track.
“The training samples’ brightness is randomly changed so as to have training data that closely represents various lighting conditions like night, cloudy, evening, etc.”
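To make that concrete, here is one common way to randomize brightness (and add a crude shadow) with OpenCV. The ranges below are illustrative rather than Sujay’s exact settings.

```python
# Lighting augmentations: random brightness scaling in HSV space, plus a
# simple darkened band to simulate a cast shadow.
import random

import cv2
import numpy as np

def random_brightness(image):
    """Scale the V channel by a random factor to mimic darker or brighter scenes."""
    hsv = cv2.cvtColor(image, cv2.COLOR_BGR2HSV).astype(np.float32)
    hsv[:, :, 2] = np.clip(hsv[:, :, 2] * random.uniform(0.3, 1.2), 0, 255)
    return cv2.cvtColor(hsv.astype(np.uint8), cv2.COLOR_HSV2BGR)

def random_shadow(image):
    """Darken a random vertical band of the frame to simulate a cast shadow."""
    h, w = image.shape[:2]
    x1, x2 = sorted(random.sample(range(w), 2))
    shadowed = image.copy()
    shadowed[:, x1:x2] = (shadowed[:, x1:x2] * 0.5).astype(np.uint8)
    return shadowed
```

Because the model only ever sees the augmented copies at training time, it never learns to rely on the bright, even lighting of the first track.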
You don’t need lots of data! (Udacity Behavioral Cloning)
This post encourages students by showing how it’s possible to build a behavioral cloning model without tens of thousands of training images. The secret is to use side cameras and data augmentation.
“Just like anything we do, the longer we practice, the better we get at it, because we take in hour after hour of data into our brain memory/muscle memory. It’s the same here for a neural net: the more variety of data you have to train your network, the better the model is at the task.”
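Here is a small sketch of the side-camera trick the post leans on: the left and right frames are labeled as if the car were off-center, so one recorded moment yields three training examples. The 0.2 correction is a common illustrative value, not necessarily the author’s.

```python
# Turn each simulator log row into three labeled samples using the side cameras.
import csv

CORRECTION = 0.2  # illustrative steering offset for the side cameras
image_paths, angles = [], []

with open('driving_log.csv') as f:          # assumes no header row
    for center, left, right, steering, *_ in csv.reader(f):
        steering = float(steering)
        image_paths += [center, left.strip(), right.strip()]
        angles += [steering, steering + CORRECTION, steering - CORRECTION]

# Each recorded moment now yields three labeled examples; horizontally flipping
# every image (and negating its angle) would double the set again.
```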
As you can see from these examples, there is no one right way to approach a project like this, and there is a great deal of room for creativity. What should also be clear is that our students are incredible!
We’re very excited about the next projects on the horizon, and we look forward to sharing more amazing student work with you soon!