Udacity’s Self-Driving Car Nanodegree — Term 3 Review
My wrap-up post on Term 3 of Udacity’s awesome Self-Driving Car Nanodegree program!

I ended up taking a break between Terms 2 & 3 to work on a personal project (more on that in a future post), so Term 3 kicked off for me in Feb 2018 instead of Jan 2018 and wrapped up, a little earlier than expected, in the first half of April. This term felt more accelerated than the last, with most of the time split between the first path planner project and the final capstone project. Fortunately, I ended up on a great capstone team from an earlier cohort, which motivated me to keep pace and work harder to finish the program early.
Preparing for the term
Keeping with tradition, here’s a breakdown of Term 3 courtesy of David Silver. The focus of this term is adding depth to some of the concepts introduced in Term 1 and Term 2 as well as introducing ROS and its benefits when writing software for self-driving cars.
I went into this term without any targeted prep work like I did for the first two terms. I did, however, benefit from building out a personal project using a fully convolutional network, which gave me a pretty good head start on the second project. With that in mind, here are a couple of items that are invariably useful if you're looking to get ahead in Term 3:
- The ‘Search’ lectures in AI for Robotics (Udacity) to understand A*
- CS231n ‘Detection & Segmentation’ Lecture
- ROS Tutorials
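If you want a refresher before the lectures, the core of what the 'Search' material builds up to is grid-based A*. Here's a minimal sketch in plain Python; the maze, 4-connected moves, and Manhattan heuristic are my own illustrative choices, not Udacity's exercise code:

```python
import heapq

def a_star(grid, start, goal):
    """Shortest path on a 4-connected grid (0 = free, 1 = blocked)."""
    def h(cell):  # Manhattan distance: admissible on a 4-connected grid
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    rows, cols = len(grid), len(grid[0])
    open_set = [(h(start), 0, start)]          # (f = g + h, g, cell)
    came_from, g_cost = {}, {start: 0}
    while open_set:
        _, g, cell = heapq.heappop(open_set)
        if cell == goal:                        # reconstruct by walking back
            path = [cell]
            while cell in came_from:
                cell = came_from[cell]
                path.append(cell)
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cell[0] + dr, cell[1] + dc)
            if 0 <= nxt[0] < rows and 0 <= nxt[1] < cols and grid[nxt[0]][nxt[1]] == 0:
                if g + 1 < g_cost.get(nxt, float("inf")):
                    g_cost[nxt] = g + 1
                    came_from[nxt] = cell
                    heapq.heappush(open_set, (g + 1 + h(nxt), g + 1, nxt))
    return None  # goal unreachable

maze = [[0, 1, 0],
        [0, 1, 0],
        [0, 0, 0]]
print(a_star(maze, (0, 0), (0, 2)))  # [(0, 0), (1, 0), (2, 0), (2, 1), (2, 2), (1, 2), (0, 2)]
```

The course's hybrid A* extends this same idea to continuous vehicle states, which is why the grid version is worth internalizing first.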
Term Breakdown
This final term is structured into 3 main projects supported by course material:
- Path Planning
- (Elective) Semantic Segmentation OR Functional Safety
- System Integration
Path Planning
The path planning module picks up where Term 2 left off and starts to thread together the concepts of sensor fusion, control and localization to create a path planner that enables a self-driving car to get from Point A to Point B.
Things kick off with an introduction to the A* algorithm for finding an optimal path through a maze, followed by a quick explanation and exercise involving a Naive Bayes classifier for predicting the future state of vehicle trajectories. The next few lessons deal with a finite state machine representation of vehicle state and the use of cost functions to determine the next best state for the vehicle (e.g. prepare for right turn, change lane, stay in lane). We then jump into hybrid A* and its use of heuristics, and the lectures wrap up with a derivation of jerk-minimizing trajectories using quintic polynomials, with boundary conditions that define the start and end vehicle position, velocity, and acceleration.
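The jerk-minimizing trajectory derivation boils down to something quite compact: the start state fixes the first three coefficients of the quintic, and the end state leaves a 3x3 linear system for the rest. A sketch of that idea (my own formulation of the standard derivation, not the course's solution code):

```python
import numpy as np

def jerk_minimizing_trajectory(start, end, T):
    """Coefficients a0..a5 of s(t) = a0 + a1*t + ... + a5*t^5 connecting
    start = [s, s_dot, s_ddot] to end = [s, s_dot, s_ddot] over duration T."""
    a0, a1, a2 = start[0], start[1], start[2] / 2.0  # fixed by the start state
    # Boundary conditions at t = T give a 3x3 system for a3, a4, a5.
    A = np.array([[T**3,     T**4,     T**5],
                  [3*T**2,   4*T**3,   5*T**4],
                  [6*T,     12*T**2,  20*T**3]])
    b = np.array([end[0] - (a0 + a1*T + a2*T**2),
                  end[1] - (a1 + 2*a2*T),
                  end[2] - 2*a2])
    a3, a4, a5 = np.linalg.solve(A, b)
    return [a0, a1, a2, a3, a4, a5]

# Example: travel 10 m in 5 s, starting and ending at rest.
coeffs = jerk_minimizing_trajectory([0, 0, 0], [10, 0, 0], T=5.0)
```

Evaluating the resulting polynomial at t = 0 and t = T recovers the boundary conditions exactly, which is a handy sanity check when implementing this in the project.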
This module culminates with the Path Planning project which puts together all the introduced concepts to drive an ego vehicle (read: your car) on a highway while safely maneuvering around traffic.
The project itself was very interesting, although the walk-through gives away most of it. I was somewhat underwhelmed that there's no sensor fusion component and that the sensor results are simply handed to us, but I figured this was necessary to restrict the scope of the project strictly to path planning. I also think Udacity would benefit from adding a challenge portion to this project involving the use of A* in a parking lot environment to make things more interesting. I'd recommend spending some time on this project without watching the walk-through to get the most out of it.

Semantic Segmentation (Elective)
I decided to go with the semantic segmentation elective because who doesn’t love FCNs?
This module introduces the FCN architecture and the techniques used to turn a model that generates a bounding box into one that produces a pixel-wise mask of an object of interest. The lectures introduce the concepts of up-sampling, transpose and 1x1 convolutions, and skip connections to turn traditional object detectors into FCNs.
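To make one of these pieces concrete: a 1x1 convolution is just a fully-connected layer applied independently at every pixel, which is what lets an FCN replace dense layers without throwing away spatial information. A numpy sketch (the shapes and the road/not-road framing are illustrative, not the project's architecture):

```python
import numpy as np

def conv_1x1(feature_map, weights, bias):
    """1x1 convolution: multiply a (H, W, C_in) feature map by a
    (C_in, C_out) weight matrix at every spatial location independently."""
    return feature_map @ weights + bias  # matmul broadcasts over H and W

# A fake 8x8 feature map with 512 channels, projected down to 2 class
# scores per pixel (e.g. road / not-road) while keeping the 8x8 layout.
fmap = np.random.randn(8, 8, 512)
w = np.random.randn(512, 2) * 0.01
scores = conv_1x1(fmap, w, np.zeros(2))
print(scores.shape)  # (8, 8, 2)
```

Transpose convolutions then up-sample these coarse per-pixel scores back to the input resolution, with skip connections adding back the fine spatial detail lost in the encoder.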
What I found of particular interest in this module were the methods used to improve inference performance of a trained model by optimizing the TensorFlow graph to merge operations, drop portions of the graph that aren’t reached and quantize trained weights.
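The weight-quantization part of that is simple enough to sketch: map float weights onto 8-bit integers with a per-tensor scale, cutting storage roughly 4x for a small accuracy cost. A toy affine quantizer in numpy (my own illustration, not TensorFlow's actual graph transform):

```python
import numpy as np

def quantize_uint8(w):
    """Affine-quantize a float tensor to uint8 with a per-tensor scale and offset."""
    lo, hi = w.min(), w.max()
    scale = (hi - lo) / 255.0 if hi > lo else 1.0
    q = np.round((w - lo) / scale).astype(np.uint8)
    return q, scale, lo

def dequantize(q, scale, lo):
    return q.astype(np.float32) * scale + lo

weights = np.random.randn(256, 256).astype(np.float32)
q, scale, lo = quantize_uint8(weights)
err = np.abs(dequantize(q, scale, lo) - weights).max()
# q uses 1 byte per weight vs 4 for float32; err is bounded by ~scale / 2
```

The operation-merging and dead-branch pruning steps are graph-level rewrites rather than numeric tricks, but the quantization idea above is the part that most directly trades precision for speed and size.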
The project for this elective is segmenting all road pixels in a video feed of a car driving around in a city. As an extra challenge, the data is labeled to distinguish between different sides of a carriageway, so the project can be extended to segment the traveled side and opposite side of the road separately. Once again, the walk-through gives away most of the project, so it would be beneficial to try to do this project without watching it.

System Integration
The final module of the SDC program is dedicated to introducing the Robot Operating System (ROS) and how to use it. The lectures cover everything from installing and setting up ROS and understanding its pub/sub architecture to writing your own ROS nodes to run on Carla, Udacity's self-driving car, which your code will need to navigate around a parking lot.
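If the pub/sub model is new to you, it can be demystified without a ROS install. Below is a ROS-free toy broker in plain Python that mimics the pattern; in real ROS, `rospy.Publisher`/`rospy.Subscriber` and the ROS master play these roles, and messages are typed rather than plain dicts:

```python
from collections import defaultdict

class Broker:
    """Toy stand-in for the ROS master plus topic transport."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)   # like rospy.Subscriber

    def publish(self, topic, message):
        for cb in self.subscribers[topic]:         # like Publisher.publish
            cb(message)

bus = Broker()
received = []
# A node listening for the car's pose, the way the capstone's nodes do.
bus.subscribe("/current_pose", lambda msg: received.append(msg))
bus.publish("/current_pose", {"x": 10.0, "y": 2.5})
print(received)  # [{'x': 10.0, 'y': 2.5}]
```

The key property this illustrates is decoupling: publishers never know who is listening, which is what lets you swap, restart, or add nodes (simulator vs. Carla) without touching the rest of the system.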
It would be very beneficial to have a working knowledge of ROS going into this project so I highly recommend completing the tutorials available on the ROS website prior to the start of the term.
This capstone project is the only one in the curriculum that needs to be submitted as a team project if you want your code to run on Carla, since individual submissions are only graded in the simulator. Finding a team is relatively straightforward as long as you don't leave it to the last month of the term. I'd recommend starting your team search right at the beginning of Term 3.
The system integration project involves writing ROS nodes for updating waypoints, vehicle control, and traffic light detection to navigate a vehicle in a simulator. Once your vehicle completes the simulator track, the code is submitted to be run on Carla. Udacity provides a ROS bag of all your ROS messages for debugging after your first Carla run, and you have the option to re-submit your code for a second attempt.
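For a flavor of the waypoint-updater node, its core geometric step is finding the closest waypoint that lies ahead of the car rather than behind it. A sketch of that step (the function name and plain (x, y) waypoint format are my own; the project uses ROS message types):

```python
import math

def next_waypoint(waypoints, x, y, yaw):
    """Index of the closest waypoint ahead of a vehicle at (x, y) with heading yaw."""
    closest = min(range(len(waypoints)),
                  key=lambda i: (waypoints[i][0] - x)**2 + (waypoints[i][1] - y)**2)
    wx, wy = waypoints[closest]
    heading = math.atan2(wy - y, wx - x)
    # If the closest waypoint is behind us, take the next one along the track.
    if abs((heading - yaw + math.pi) % (2 * math.pi) - math.pi) > math.pi / 2:
        closest = (closest + 1) % len(waypoints)
    return closest

track = [(0, 0), (10, 0), (20, 0), (30, 0)]
print(next_waypoint(track, x=11.0, y=0.5, yaw=0.0))  # 2: (10, 0) is closer but behind
```

From this index, the node publishes the next N waypoints ahead, with target velocities adjusted downward when the traffic light classifier reports a red light.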
One of the biggest challenges with this project was being able to run the simulator and the full code in real-time. Udacity provides an Ubuntu VM pre-configured with ROS, but it was too laggy to be completely functional, especially when running traffic light classification. A native Ubuntu setup on a gaming rig is what eventually worked for my team.
Reflections
The term was definitely challenging and exciting, particularly the last project. Here are some of my thoughts:
- I ended up spending roughly 15 hours a week on this term, largely because I picked a team from an earlier cohort, which allowed me to finish the program earlier than scheduled.
- I’d recommend spending most of your time on the first (Path Planning) and third (System Integration) modules of this term; I found the second module a lot less intense. This gives you the added benefit of more time for the capstone project.
- Udacity has introduced the Intro to SDC nanodegree since I started the curriculum to help students build a stronger foundation coming into the SDC nanodegree. I think that’s a great step, but they should also consider using it as an opportunity to increase the difficulty of the SDC nanodegree by moving some material into Intro to SDC and extending the existing SDC projects to make them more real-world.
- In particular, I think it would be beneficial to add traffic into the vehicle simulator for the Capstone project and include a sensor fusion ROS node.
The SDC nanodegree program has been a rewarding learning experience. I’d highly recommend it to anyone considering learning more about how self-driving cars operate. While it does reduce the barrier to entry, I think those serious about starting or switching careers in this industry will want to extend their portfolios with more real-world projects. This program teaches you to ask the right questions and provides the context to go deeper into various areas of the autonomous vehicle tech stack.
It was a bittersweet feeling when the program ended but I’m now excited about using what I’ve learned to start working on the next set of projects that I’ve been thinking about!

