Term 2: In-Depth on Udacity’s Self-Driving Car Curriculum

David Silver
Udacity Inc
Feb 14, 2017

Update: Udacity has a new self-driving car curriculum! The post below is now out-of-date, but you can see the new syllabus here.

The very first class of students has finished Term 1 of the Udacity Self-Driving Car Engineer Nanodegree Program! We are so excited by their accomplishments—they have built traffic sign classifiers, end-to-end neural networks for driving, lane-finding algorithms, and vehicle tracking pipelines.

Now it’s time for Term 2 — hardcore robotics.

The focus of Term 1 was applying machine learning to automotive tasks: deep learning, convolutional neural networks, support vector machines, and computer vision.

In Term 2, students will build the core robotic functions of an autonomous vehicle system: sensor fusion, localization, and control. This is the muscle of a self-driving car!

Term 2

Sensor Fusion

Our terms are broken out into modules, each of which is composed of a series of focused lessons. This Sensor Fusion module is built with our partners at Mercedes-Benz. The team at Mercedes-Benz is amazing. They are world-class automotive engineers applying autonomous vehicle techniques to some of the finest vehicles in the world. They are also Udacity hiring partners, which means the curriculum we’re developing together is expressly designed to nurture and advance the kind of talent they would like to hire!

Lidar Point Cloud

Here are descriptions of the lessons that make up our Sensor Fusion module:

  1. Sensors
    The first lesson of the Sensor Fusion module covers the physics of two of the most important sensors on an autonomous vehicle — radar and lidar.
  2. Kalman Filters
    Kalman filters are the key mathematical tool for fusing sensor data. Implement these filters in Python to combine measurements from a single sensor over time. (A minimal sketch of the predict/update cycle follows this list.)
  3. C++ Primer
    Review the key C++ concepts for implementing the Term 2 projects.
  4. Project: Extended Kalman Filters in C++
    Extended Kalman filters are used by autonomous vehicle engineers to combine measurements from multiple sensors, even when the underlying motion and measurement models are non-linear. Building an EKF is an impressive skill to show an employer.
  5. Unscented Kalman Filter
    The Unscented Kalman filter is a mathematically sophisticated approach for combining sensor data. The UKF performs better than the EKF in many situations. This is the type of project sensor fusion engineers have to build for real self-driving cars.
  6. Project: Pedestrian Tracking
    Fuse noisy lidar and radar data together to track a pedestrian.
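
The lessons build these filters in full matrix form, but the heart of the technique, a predict/update cycle that balances motion uncertainty against measurement uncertainty, fits in a few lines. Here is a minimal one-dimensional sketch in C++ (the language of the Term 2 projects); the struct name and noise values are my own illustrative choices, not course code:

```cpp
#include <iostream>

// A minimal 1-D Kalman filter: track a single state value (e.g. position
// along one axis) and its variance, fusing noisy measurements over time.
// All names and values here are illustrative, not course-provided code.
struct Kalman1D {
    double x = 0.0;  // state estimate
    double p = 1.0;  // estimate variance (uncertainty)

    // Predict step: move the state by a motion u whose noise has
    // variance q. Uncertainty grows because motion is imperfect.
    void predict(double u, double q) {
        x += u;
        p += q;
    }

    // Update step: fuse a measurement z with variance r. The Kalman
    // gain k weights the measurement by its relative certainty.
    void update(double z, double r) {
        double k = p / (p + r);  // Kalman gain
        x += k * (z - x);        // correct estimate toward measurement
        p *= (1.0 - k);          // uncertainty shrinks after fusing
    }
};

int main() {
    Kalman1D kf;
    const double measurements[] = {5.0, 6.1, 6.9, 8.2};
    for (double z : measurements) {
        kf.predict(1.0, 0.1);  // assume roughly 1 unit of motion per step
        kf.update(z, 0.5);
        std::cout << "x=" << kf.x << " p=" << kf.p << "\n";
    }
}
```

The extended and unscented variants swap in non-linear motion and measurement models (by linearizing, or by propagating sigma points), but the gain-weighted correction at the core stays the same.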

Localization

This module is also built with our partners at Mercedes-Benz, who employ cutting-edge localization techniques in their own autonomous vehicles. Together we show students how to implement and use foundational algorithms that every localization engineer needs to know.

Particle Filter

Here are the lessons in our Localization module:

  1. Motion
    Study how motion and probability affect your belief about where you are in the world.
  2. Markov Localization
    Use a Bayesian filter to localize the vehicle in a simplified environment.
  3. Egomotion
    Learn basic models for vehicle movement, including the bicycle model (see the sketch after this list). Estimate the position of the car over time given different sensor data.
  4. Particle Filter
    Use a probabilistic sampling technique known as a particle filter to localize the vehicle in a complex environment (a simplified sketch follows this list).
  5. High-Performance Particle Filter
    Implement a particle filter in C++.
  6. Project: Kidnapped Vehicle
    Implement a particle filter to take real-world data and localize a lost vehicle.
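
To give a flavor of the egomotion lesson, here is a minimal kinematic bicycle model sketch in C++. The state layout, wheelbase, and input values are illustrative assumptions rather than the course's exact formulation:

```cpp
#include <cmath>
#include <iostream>

// Kinematic bicycle model sketch: the car is approximated by a single
// front and rear wheel separated by wheelbase L. All names and values
// here are illustrative assumptions.
struct State { double x, y, theta; };  // position (m) and heading (rad)

State bicycleStep(State s, double v, double delta, double L, double dt) {
    // v: speed (m/s), delta: steering angle (rad), dt: time step (s)
    s.x     += v * std::cos(s.theta) * dt;
    s.y     += v * std::sin(s.theta) * dt;
    s.theta += v / L * std::tan(delta) * dt;  // yaw rate from steering
    return s;
}

int main() {
    State s{0.0, 0.0, 0.0};
    for (int i = 0; i < 10; ++i)
        s = bicycleStep(s, 10.0, 0.05, 2.8, 0.1);  // a gentle left turn
    std::cout << "x=" << s.x << " y=" << s.y << " theta=" << s.theta << "\n";
}
```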
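
And here is a deliberately simplified particle filter, localizing along a single axis with one known landmark. The Kidnapped Vehicle project does this in two dimensions with a real map and many landmarks, so treat the measurement model and noise values below as assumptions chosen for illustration:

```cpp
#include <cmath>
#include <iostream>
#include <random>
#include <vector>

// Minimal 1-D particle filter sketch: localize along a line given noisy
// range measurements to one known landmark. Names and noise values are
// illustrative assumptions, not project code.
int main() {
    std::mt19937 gen(42);
    const double landmark = 10.0;  // known map position of the landmark
    const double motion = 1.0;     // the car moves 1 m per step
    const double sigma = 0.5;      // measurement noise std dev

    // 1. Initialize: spread particles uniformly over the map.
    std::uniform_real_distribution<double> uniform(0.0, 20.0);
    std::vector<double> particles(1000);
    for (double& p : particles) p = uniform(gen);

    double truth = 2.0;  // simulated true position, unknown to the filter
    std::normal_distribution<double> motionNoise(0.0, 0.1);
    std::normal_distribution<double> measNoise(0.0, sigma);

    for (int t = 0; t < 5; ++t) {
        truth += motion;
        double z = landmark - truth + measNoise(gen);  // noisy range reading

        // 2. Predict: move every particle, adding motion noise.
        for (double& p : particles) p += motion + motionNoise(gen);

        // 3. Weight: how well does each particle explain the measurement?
        std::vector<double> weights(particles.size());
        for (std::size_t i = 0; i < particles.size(); ++i) {
            double err = z - (landmark - particles[i]);
            weights[i] = std::exp(-err * err / (2 * sigma * sigma));
        }

        // 4. Resample: keep particles in proportion to their weights.
        std::discrete_distribution<std::size_t> pick(weights.begin(),
                                                     weights.end());
        std::vector<double> next(particles.size());
        for (double& p : next) p = particles[pick(gen)];
        particles.swap(next);

        double mean = 0.0;
        for (double p : particles) mean += p;
        mean /= particles.size();
        std::cout << "t=" << t << " estimate=" << mean
                  << " truth=" << truth << "\n";
    }
}
```

The four numbered steps (initialize, predict, weight, resample) are the same ones students implement at scale in the project.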

Control

This module is built with our partners at Uber Advanced Technologies Group. Uber is one of the fastest-moving companies in the autonomous vehicle space. They are already testing their self-driving cars in multiple locations in the US, and they’re excited to introduce students to the core control algorithms that autonomous vehicles use. Uber ATG is also a Udacity hiring partner, so pay attention to their lessons if you want to work there!

A controller in the Udacity simulator.

Here are the lessons:

  1. Control
    Learn how control systems actuate a vehicle to move it on a path.
  2. PID Control
    Implement the classic closed-loop controller — a proportional-integral-derivative control system (see the sketch after this list).
  3. Linear Quadratic Regulator
    Implement a more sophisticated control algorithm for stabilizing the vehicle in a noisy environment.
  4. Project: Lane-Keeping
    Implement a controller to keep a simulated vehicle in its lane. For an extra challenge, use computer vision techniques to identify the lane lines and estimate the cross-track error.
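
As a taste of the PID lesson, here is a minimal sketch in C++ that steers on the cross-track error (CTE), the signed distance from the lane center. The gains and the toy one-line "plant" are illustrative assumptions; in the project, the simulator provides the CTE and students tune the gains themselves:

```cpp
#include <iostream>

// Minimal PID controller sketch on cross-track error (CTE). Gains are
// illustrative assumptions, not tuned values from the course.
struct PID {
    double kp, ki, kd;       // proportional, integral, derivative gains
    double prevError = 0.0;  // for the derivative term
    double integral  = 0.0;  // accumulated error for the integral term

    // Returns a steering command given the current CTE and time step.
    double control(double cte, double dt) {
        integral += cte * dt;
        double derivative = (cte - prevError) / dt;
        prevError = cte;
        return -(kp * cte + ki * integral + kd * derivative);
    }
};

int main() {
    PID pid{0.5, 0.01, 0.1};   // illustrative gains
    double cte = 1.0;          // start 1 m off the lane center
    for (int t = 0; t < 20; ++t) {
        double steer = pid.control(cte, 0.1);
        cte += steer * 0.1;    // toy plant: steering directly reduces CTE
        std::cout << "t=" << t << " cte=" << cte
                  << " steer=" << steer << "\n";
    }
}
```

The proportional term pulls the car toward the lane center, the derivative term damps the resulting oscillation, and the integral term removes steady offsets such as steering drift.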

I hope this gives you a good sense of what students can expect from Term 2! Things may change along the way, of course, as we absorb feedback, incorporate new content, and take advantage of new opportunities that arise, but we’re really excited about the curriculum we’ve developed with our partners, and we can’t wait to see what our students build!

In case you’d like a refresher on what was covered in Term 1, you can read my Term 1 curriculum post here.

In closing, if you haven’t yet applied to join the Udacity Self-Driving Car Engineer Nanodegree Program, please do! We are taking applications for the 2017 terms and would love to have you in the class!
