Robotics Nanodegree Program Syllabus: Term One, in Depth

As announced at Udacity’s Intersect 2017 Conference, applications are now open for our groundbreaking new Robotics Nanodegree program. We are so excited to offer a program that instills fundamental robotics knowledge while simultaneously teaching students the cutting-edge techniques that are impacting myriad industries and engendering positive change at a global level.

Plus, in Term Two, all accepted students will receive robot hardware* to deploy code to!

Our Approach: Practical and Industry-Aligned

In this program, we place a keen emphasis on building practical robotics system skills. Through a series of projects and exercises, you will learn to work with sensor data for the task of perception, implement and apply artificial intelligence algorithms in the process of decision making, and then command your robot to take action!

The robotics industry is growing rapidly, and demand has never been stronger for engineers with skills in this space. Companies like Bosch, Kuka, Lockheed Martin, iRobot, Uber ATG, and Alphabet’s Moonshot Factory, X, have joined us as hiring partners to get exclusive access to graduates of this program. We have built the curriculum in partnership with Electric Movement to give you, the aspiring robotics engineer, the skills and experiences needed to join this burgeoning industry.

The Robotics Nanodegree program is a six-month program, divided into two terms of three months each, with a brief break at the midway point for students to reflect, recharge, and prepare for the next term. In Term One, you’ll get up to speed with the tools and techniques needed for any robotics project, large or small. In Term Two, you’ll expand your robotics toolkit and receive a hardware platform on which to run the code you’ve been developing!

Each term is composed of several lessons focusing on key topics. As you proceed through each lesson, you’ll have the opportunity to test your skills with quizzes and exercises. You’ll also submit projects, and receive detailed feedback from our expert reviewers.

Here is a week-by-week look at what we’ll cover in Term One of the Robotics Nanodegree program:

Week One: Introduction to Robotics

Expert project review, mentorship, and career services are going to be critical components of your learning experience, so we’ll take some time at the beginning of the program to orient you to all these features. The tight-knit community of students and instructors you’ll be joining is also going to be a huge part of your success, so we’ll make sure you’re signed up for the Slack community, and comfortable with using the forums. After that, we’ll dive straight into building the first project, where you’ll use fundamental computer vision techniques to navigate a simulated environment with a rover.

Project: Search and Sample Return

This first project is styled after the NASA Sample Return Robot Competition. In a simulated environment you’ll perform a search for samples of interest using some basic computer vision techniques. With just a few lines of Python code you’ll get a chance to experience the three main steps in robotics: perception, decision making, and actuation.
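Those three steps can be sketched in just a few lines. Everything below (the function names, the threshold values, the 1-D “image”) is illustrative rather than the project’s actual code:

```python
# A minimal perceive-decide-act loop, the pattern the rover project follows.

def perceive(image):
    """Perception: threshold the image to find navigable terrain (toy version)."""
    # Treat bright pixels as navigable ground, as in simple color thresholding.
    return [pixel > 160 for pixel in image]

def decide(navigable):
    """Decision making: go forward if enough of the view is navigable."""
    return "forward" if sum(navigable) / len(navigable) > 0.5 else "turn"

def act(command):
    """Actuation: translate the decision into (throttle, steering) values."""
    return {"forward": (0.2, 0.0), "turn": (0.0, -15.0)}[command]

# One pass through the loop on a toy 1-D "image" of pixel intensities.
frame = [200, 180, 90, 210, 175]
throttle, steer = act(decide(perceive(frame)))
```

The real project runs this loop on camera images from the simulator, but the shape of the pipeline, sense then decide then command, is exactly the same.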

Week Two: ROS Basics

The Robot Operating System, or “ROS”, is an open-source framework that forms the system backbone of many robotics solutions built today. As a framework, it defines a software architecture where multiple “nodes” or processes communicate with each other by publishing and/or subscribing to messages. This becomes extremely useful as a robot system may be composed of multiple software components, sometimes running on separate pieces of hardware. In this series of lessons, you will become acquainted with the core components of the ROS framework, and you’ll begin writing code that’s capable of controlling a real robot!
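To see the pattern without installing anything, here is the publish/subscribe idea reduced to a toy in-process message bus in plain Python. A real ROS node would use `rospy.Publisher` and `rospy.Subscriber`; this sketch only illustrates how decoupled nodes exchange messages by topic name:

```python
# The ROS publish/subscribe pattern, sketched in plain Python (no ROS required).

from collections import defaultdict

class Bus:
    """A toy stand-in for the role roscore plays: routing messages by topic."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, message):
        # Deliver the message to every node listening on this topic.
        for callback in self.subscribers[topic]:
            callback(message)

bus = Bus()
received = []

# "Subscriber node": reacts to velocity commands on a topic.
bus.subscribe("/cmd_vel", lambda msg: received.append(msg))

# "Publisher node": sends a command without knowing who is listening.
bus.publish("/cmd_vel", {"linear": 0.5, "angular": 0.1})
```

The key property to notice is the decoupling: the publisher never names the subscriber, so components can be added, removed, or moved to different hardware without touching the rest of the system.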

Exercise: Turtlesim in ROS

In this exercise you’ll gain firsthand experience with how nodes interact using the ROS publish/subscribe messaging architecture. You’ll use the roscore server to communicate between several ROS nodes, all of which contribute to your ability to control and profile the movement of a Turtle Robot in simulation.

Weeks Three and Four: Kinematics

Robotic arms are used in numerous applications across a range of industries, and in this series of lessons you’ll learn how to manipulate a robotic arm through the application of kinematics, which is the branch of mechanics that allows you to describe the motion of objects without reference to their mass or the physical forces acting on them. Kinematics is absolutely central to robotics, because it describes the way robots actually move! You’ll develop a mathematical foundation that allows for a description of the state (position, velocity, acceleration) of a robotic arm with multiple joints and axes of rotation.
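The simplest instance of this is forward kinematics for a two-link arm in a plane: given the joint angles, where does the end effector end up? The project’s six-degree-of-freedom arm uses the same idea, generalized with Denavit–Hartenberg parameters and 4×4 homogeneous transforms, but the 2-D case fits in a few lines:

```python
# Forward kinematics for a toy two-link planar arm.

from math import cos, sin, pi

def forward_kinematics(theta1, theta2, l1=1.0, l2=1.0):
    """Return the (x, y) position of the end effector.

    theta1 is the shoulder angle; theta2 is the elbow angle relative
    to the first link; l1 and l2 are the link lengths.
    """
    x = l1 * cos(theta1) + l2 * cos(theta1 + theta2)
    y = l1 * sin(theta1) + l2 * sin(theta1 + theta2)
    return x, y

# Both links straight out along x: end effector at (2, 0).
x, y = forward_kinematics(0.0, 0.0)

# Elbow bent 90 degrees: end effector at (1, 1).
x2, y2 = forward_kinematics(0.0, pi / 2)
```

Inverse kinematics, which you’ll also need for the project, asks the reverse question: which joint angles place the end effector at a desired position?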

Project: Pick and Place

In this project, using what you now know of kinematics, you’ll use ROS to manipulate a simulated robotic arm with six degrees of freedom to pick up an object from one location and place it in another. You’ll need to first identify where the object is located, then successfully retrieve it and deliver it to the target location without running into obstacles in the environment. Once you master this, you’re ready for the Amazon Robotics Challenge!

Weeks Five through Eight: Computer Vision

For robots, cameras combined with powerful computer vision techniques serve as a primary means of understanding and navigating the environment. In this section you will learn about object recognition, segmentation, and how to use depth data for 3D perception. You will use the Point Cloud Library (PCL) and the Random Sample Consensus (RANSAC) algorithm to fit a model in the presence of outliers in the data. You’ll then use clustering techniques to segment your point cloud into individual objects, and object recognition techniques to find the object you’re looking for!
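The consensus idea behind RANSAC is simple enough to sketch in miniature. Here it fits a 2-D line to points despite gross outliers; PCL applies the same loop in 3-D to fit planes (a tabletop, say) in a point cloud. This is a pure-Python illustration, not PCL’s actual API:

```python
# RANSAC in miniature: fit a line y = m*x + b to points despite outliers.

import random

def fit_line(p, q):
    """Exact line through two sample points."""
    m = (q[1] - p[1]) / (q[0] - p[0])
    return m, p[1] - m * p[0]

def ransac_line(points, iterations=200, threshold=0.1, seed=0):
    rng = random.Random(seed)
    best_model, best_inliers = None, []
    for _ in range(iterations):
        # 1. Sample the minimal set needed to define a model (2 points for a line).
        p, q = rng.sample(points, 2)
        if p[0] == q[0]:
            continue  # vertical pair; skip this minimal sample
        m, b = fit_line(p, q)
        # 2. Count inliers: points within `threshold` of the candidate line.
        inliers = [pt for pt in points if abs(pt[1] - (m * pt[0] + b)) < threshold]
        # 3. Keep the model with the largest consensus set.
        if len(inliers) > len(best_inliers):
            best_model, best_inliers = (m, b), inliers
    return best_model, best_inliers

# Points on y = 2x with two gross outliers mixed in.
points = [(x, 2.0 * x) for x in range(10)] + [(3, 40.0), (7, -5.0)]
(m, b), inliers = ransac_line(points)
```

Because outliers rarely agree with each other, the model with the most inliers is almost always the one fit to clean data; that robustness is exactly why RANSAC works so well on noisy depth sensors.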

Project: Perception

Using what you’ve learned about perception, you’ll tackle the task of locating an object in a cluttered environment, then controlling a robotic arm to grab it and move it to a different location. The PR2 is an advanced two-armed robotics development platform created by Willow Garage, and in this project you will use the PR2 in simulation to accomplish this task. Here you will leverage MoveIt!, one of the most powerful software packages in the ROS ecosystem, to perform collision detection and motion planning.

Weeks Nine and Ten: Controls

One of the fundamental pieces of building a good robot is writing a good control algorithm! In this series of lessons, you’ll learn the fundamentals of the algorithms used to control 95% of the world’s robots!
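In most cases, those algorithms are variations on the PID (proportional-integral-derivative) controller. A minimal sketch, with a deliberately simplistic plant model and hand-tuned gains that are illustrative rather than taken from the course:

```python
# A minimal PID controller driving a toy system toward a setpoint.

class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, error, dt):
        """Return a control command from the current tracking error."""
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        # P term reacts to the present error, I to its history, D to its trend.
        return self.kp * error + self.ki * self.integral + self.kd * derivative

pid = PID(kp=2.0, ki=1.0, kd=0.1)

# Toy plant: position changes in direct proportion to the command.
position, dt = 0.0, 0.05
for _ in range(400):
    command = pid.update(1.0 - position, dt)  # setpoint is 1.0
    position += command * dt
```

Swap the toy plant for a quadcopter’s altitude dynamics, a steering model, or a joint motor and the controller itself is unchanged, which is why the same few lines show up all over robotics.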

Exercise: Drone Flight Controller

In this exercise, you will take what you know about controls and apply it to stabilizing a quadcopter in flight. While the algorithm you will write is designed to move a flying robot to a desired position, the very same algorithm could be applied to keep a car in its lane, to move a robotic arm, or even to stabilize the rate of some chemical process!

Weeks 11 through 14: Deep Learning for Robotics

Increasingly, the perception and decision-making steps in robotics are being powered by deep neural networks. In this final section of Term One, you will have the chance to apply deep learning to perception and control tasks.

Project: Follow Me

In this project, you will train a deep neural network to identify and track a target in simulation and then issue commands to a drone to follow that target. So-called “follow me” applications like this are key to many fields of robotics and the very same techniques you apply here could be extended to scenarios like adaptive cruise control in autonomous vehicles or human-robot collaboration in industry.
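However deep the network, training comes down to the same step repeated: compute a loss, follow its gradient downhill. The toy below trains a single sigmoid neuron to separate “target” from “background” values; the project’s network stacks many layers and works on images, but the update rule is the same idea. All of the data and hyperparameters here are made up for illustration:

```python
# The core of neural network training: gradient descent on a loss.

import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Toy data: a 1-D "pixel intensity" -> target (1) or background (0)?
data = [(0.1, 0), (0.2, 0), (0.3, 0), (0.7, 1), (0.8, 1), (0.9, 1)]

w, b, lr = 0.0, 0.0, 1.0
for epoch in range(2000):
    for x, label in data:
        pred = sigmoid(w * x + b)
        grad = pred - label   # d(cross-entropy loss)/d(pre-activation)
        w -= lr * grad * x    # gradient descent step on the weight
        b -= lr * grad        # ...and on the bias

# Fraction of toy examples the trained neuron classifies correctly.
accuracy = sum((sigmoid(w * x + b) > 0.5) == bool(label)
               for x, label in data) / len(data)
```

Backpropagation is what extends this to many layers: it computes the same kind of gradient for every weight in the network, so the identical nudge-downhill update applies throughout.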

This Term One curriculum gives you the foundation you need to tackle a wide range of challenges in robotics, and it also prepares you for Term Two, in which you’ll have the chance to implement these and other cutting-edge techniques using real hardware.

*Term Two Sneak Peek

Here’s a quick look at what’s in store for Term Two:

  • Motion Planning
  • Localization
  • Hardware Integration (stay tuned for additional info—we’ll cover the Term Two syllabus in depth in a future post, and provide details about your robot hardware!)

For more information about our Robotics Nanodegree Program, you can check out these Frequently Asked Questions, or join our public Slack team to ask us questions directly, or just to chat with the community about all things robotics. If you’re ready to jump in with both feet, then apply today!