We’re Building an Open Source Self-Driving Car

And we want your help!

Oliver Cameron
Udacity Inc
4 min read · Sep 29, 2016


At Udacity, we believe in democratizing education. How can we provide opportunity to everyone on the planet? We also believe in teaching really amazing and useful subject matter. When we decided to build the Self-Driving Car Nanodegree program, to teach the world to build autonomous vehicles, we instantly knew we had to tackle our own self-driving car too.

Together with Google Self-Driving Car founder and Udacity President Sebastian Thrun, we formed our core Self-Driving Car Team. One of the first decisions we made? Open source code, written by hundreds of students from across the globe!

Our 2016 Lincoln MKZ

Why? We want to give the world the ability to contribute code to a real self-driving car that will run on the road—a learning experience that doesn’t exist anywhere else on the planet. Opportunities exist to contribute to Linux, React, and thousands of other open source projects, but nowhere can you contribute code that will run on a real self-driving car.

Another player wants to make its own self-driving car, and not for the reason you’d think. Udacity plans to build an autonomous vehicle and completely open-source the whole design. This isn’t about charity or generosity, it’s about education. – Futurism

To bring this open source project to life, we bought a car: a 2016 Lincoln MKZ, to be exact. We installed sensors and parts: 2 Velodyne VLP-16 LiDARs, 1 Delphi radar, 3 Point Grey Blackfly cameras, an Xsens IMU, an ECU, a power distribution system, and more! We configured ROS, wrote a lot of code, and now we’re ready to build and refine an open source self-driving car with the help of students from around the world.

One of our Velodyne VLP-16s
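If you're curious what the software side of a setup like this looks like, here's a rough sketch of a minimal ROS node listening to a similar sensor suite. It is not our in-car code; the topic names are assumptions based on common driver defaults.

```python
# A rough sketch (not the actual in-car code) of a minimal ROS node that
# listens to a camera, a Velodyne LiDAR, and an IMU. Topic names are
# assumptions; the real topics depend on the drivers and launch files used.
import rospy
from sensor_msgs.msg import Image, PointCloud2, Imu

def on_image(msg):
    rospy.loginfo("camera frame: %dx%d", msg.width, msg.height)

def on_cloud(msg):
    rospy.loginfo("lidar cloud with %d points", msg.width * msg.height)

def on_imu(msg):
    rospy.loginfo("imu yaw rate: %.3f rad/s", msg.angular_velocity.z)

if __name__ == "__main__":
    rospy.init_node("sensor_listener")
    # Hypothetical topic names, chosen to mirror common driver defaults.
    rospy.Subscriber("/camera/image_color", Image, on_image)
    rospy.Subscriber("/velodyne_points", PointCloud2, on_cloud)
    rospy.Subscriber("/imu/data", Imu, on_imu)
    rospy.spin()
```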

Like any open source project, this code base will require a certain amount of thoughtfulness. However, when you add a 2-ton vehicle into the equation, we also need to make safety our absolute top priority, and pull requests just don’t cut it. To really optimize for safety, we’re breaking down the problem of making the car autonomous into Udacity Challenges.

Challenges

Each challenge will offer awesome prizes (cash and otherwise) for the most effective contributions, but more importantly, the challenge format enables us to benchmark the safety of the code before we ever think of running it in the car. We believe challenges are the best medium for us to build a Level-4 autonomous vehicle, while at the same time offering our contributors a valuable and exciting learning experience.

Today we’re announcing the opening of our second challenge to the world (Using Deep Learning to Predict Steering Angles), as well as the winner of our beta challenge.

Beta Challenge #1 — Complete

Read more about Challenge #1

Our Point Grey Blackfly cameras, while amazing, come with only simple tripod mounts, which don't support the lens when mounted in a car. Car rides are often bumpy and unpredictable, and the data we record must be consistent, otherwise hours of driving are rendered useless. We also noticed (via YouTube videos) that both Nvidia and Comma use 3D-printed hardware to mount their cameras, which inspired the first beta challenge: design a mount that attaches our cameras to a GoPro mount! Deliverable: a 3D model.

How did it go? Amazingly! We beta tested our first challenge with a small set of Udacity students and self-driving car enthusiasts. We had 26 participants, all of whom collaborated to create different versions of the mount. The winning design, verified after many 3D prints, is a mount that works great!

You can read more about the results of this challenge, and find the mount open-sourced on GitHub.

Challenge #2 — Now Open!

Read more about Challenge #2

You may have seen this incredible video from Nvidia, one of our Nanodegree program partners, which highlights their efforts at teaching a car how to drive using only cameras and deep learning. Their DAVE-2 deep learning system is capable of driving in many different weather conditions, avoiding obstacles, and even going off-road! You may have noticed their setup looks pretty similar to our 2016 Lincoln MKZ, and that’s for good reason. One of the first ways that we want to get this car on the road is to implement a similar end-to-end solution, and release that to the world for free. The second challenge for the Udacity Self-Driving Car initiative is to replicate these results using a convolutional neural network that you design and build!
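To give a flavor of what an entry might involve, here's a minimal sketch of a convolutional network, loosely inspired by NVIDIA's DAVE-2 architecture, that regresses a single steering angle from a camera frame. This is illustrative only, not starter code for the challenge; the input resolution, layer sizes, and training setup are assumptions.

```python
# Illustrative sketch only: a small convolutional network, loosely inspired
# by NVIDIA's DAVE-2, that maps one camera frame to a steering angle.
# Shapes and hyperparameters are assumptions, not challenge requirements.
from tensorflow import keras
from tensorflow.keras import layers

def build_steering_model(input_shape=(66, 200, 3)):
    model = keras.Sequential([
        layers.Input(shape=input_shape),
        # Normalize pixel values to roughly [-1, 1].
        layers.Lambda(lambda x: x / 127.5 - 1.0),
        layers.Conv2D(24, 5, strides=2, activation="relu"),
        layers.Conv2D(36, 5, strides=2, activation="relu"),
        layers.Conv2D(48, 5, strides=2, activation="relu"),
        layers.Conv2D(64, 3, activation="relu"),
        layers.Conv2D(64, 3, activation="relu"),
        layers.Flatten(),
        layers.Dense(100, activation="relu"),
        layers.Dense(50, activation="relu"),
        layers.Dense(10, activation="relu"),
        layers.Dense(1),  # predicted steering angle (e.g., in radians)
    ])
    # Regress against recorded human steering angles with mean squared error.
    model.compile(optimizer="adam", loss="mse")
    return model

if __name__ == "__main__":
    build_steering_model().summary()
```

Training a model like this on recorded camera frames paired with human steering angles is the basic idea behind the end-to-end approach; the interesting work is in data collection, augmentation, and validation, which is exactly what the challenge is about.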

Learn more about this challenge and how to contribute.

Next Steps

We can’t wait to see what our contributors come up with, and we hope you have as much fun as we’ve had working on this project. Challenge #2 is open now, and you can learn more about how to contribute.

It’s a very simple instance of a law that is fundamentally true: Technology is moving so fast, that by definition when something becomes hot, the skill set doesn’t exist – Sebastian Thrun

Any questions? Let’s talk! Feel free to chat with me on Twitter at @olivercameron.
