# Deep Learning Nanodegree Foundation Program Syllabus, In Depth

It is my pleasure today to join Siraj Raval in introducing an amazing new Udacity offering, the Deep Learning Nanodegree Foundation Program, and to share with you the exceptional curriculum we have developed in partnership with Siraj.

As with so many of you, we are incredibly excited about the possibilities of deep learning and its ability to transform technology today. Deep learning-based AI techniques have already surpassed human proficiency in object detection and speech recognition, and we are just barely scratching the surface of what’s possible!

We want to create a pathway that enables as many people as possible to enter this incredible field, which is why we are introducing our Deep Learning Nanodegree Foundation Program with Siraj Raval. This program is a 17-week, hands-on introduction to this transformational technology, designed for those who have at least an intermediate Python background.

Below, you’ll see our **curriculum**, covering Convolutional Neural Networks, Recurrent Neural Networks, Reinforcement Learning, and other applications of deep learning.

**Enrollment opens today, and closes January 20th. Enroll now, and start building your Deep Learning foundations today!**

## Week 1: Types of machine learning, when to use machine learning

We’ll start off with a simple introduction to linear regression and machine learning. This will give you the vocabulary you need to understand recent advancements, and make clear where deep learning fits into the broader picture of ML techniques.
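To give you a small taste of the starting point, here is a minimal sketch of linear regression trained with gradient descent, in plain Python. The data points and learning rate below are illustrative stand-ins, not course material:

```python
# A minimal sketch: fit a line y = w*x + b to toy data with
# gradient descent on mean squared error. All values are illustrative.

data = [(0.0, 1.0), (1.0, 3.0), (2.0, 5.0), (3.0, 7.0)]  # points on y = 2x + 1

w, b = 0.0, 0.0
lr = 0.05  # learning rate

for _ in range(2000):
    dw = db = 0.0
    for x, y in data:
        err = (w * x + b) - y              # prediction error on one point
        dw += 2 * err * x / len(data)      # gradient of MSE with respect to w
        db += 2 * err / len(data)          # gradient with respect to b
    w -= lr * dw                           # step downhill
    b -= lr * db

print(round(w, 2), round(b, 2))            # should land near 2.0 and 1.0
```

The same loop of "measure error, follow the gradient" is, at heart, how deep networks are trained too.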

## Week 2: Neural Network Architecture + Types (numerical classification)

We’ll then start exploring neural networks in depth, including canonical architectures such as LeNet and AlexNet. We’ll use these networks to automatically classify images of handwritten digits.
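As a tiny preview of what such a classifier looks like inside, here is the forward pass of a small fully connected network scoring a flattened image against ten digit classes. The weights are random stand-ins; training is what turns them into a real classifier:

```python
import numpy as np

# Forward pass of a small fully connected network. The weights here are
# random placeholders -- a real digit classifier learns them from data.

rng = np.random.default_rng(0)

def softmax(z):
    e = np.exp(z - z.max())            # subtract the max for numerical stability
    return e / e.sum()

x = rng.random(64)                     # a flattened 8x8 grayscale "image"
W1 = rng.normal(0, 0.1, (32, 64))      # input layer -> 32 hidden units
W2 = rng.normal(0, 0.1, (10, 32))      # hidden units -> 10 digit scores

hidden = np.maximum(0, W1 @ x)         # ReLU activation
probs = softmax(W2 @ hidden)           # one probability per digit, 0 through 9

print(probs.argmax())                  # the (untrained) network's guess
```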

## Week 3: Cloud computing + sentiment analysis (text classification)

We’ll train deep neural networks in the cloud using GPUs and see how we can use these models on text to do simple sentiment analysis.
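For a feel of the task (using logistic regression rather than the deep networks the lesson covers), here is sentiment classification in miniature: bag-of-words counts fed to a classifier trained with gradient descent. The tiny hand-made corpus is purely illustrative:

```python
import math

# A toy sentiment classifier: logistic regression on bag-of-words counts.
# The four-sentence "dataset" is an illustrative stand-in for real reviews.

train = [("good great fun", 1), ("bad awful boring", 0),
         ("great movie good", 1), ("boring bad plot", 0)]
vocab = sorted({w for text, _ in train for w in text.split()})

def features(text):
    words = text.split()
    return [words.count(w) for w in vocab]   # word counts as features

weights = [0.0] * len(vocab)
bias = 0.0

def predict(x):
    z = sum(w * xi for w, xi in zip(weights, x)) + bias
    return 1.0 / (1.0 + math.exp(-z))        # probability the text is positive

for _ in range(200):
    for text, label in train:
        x = features(text)
        err = predict(x) - label             # gradient of the log loss
        for j in range(len(vocab)):
            weights[j] -= 0.5 * err * x[j]
        bias -= 0.5 * err

print(predict(features("good fun movie")))   # high probability -> positive
```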

## Week 4: Math notation + recommendation systems (algebra, calculus, matrix math)

We’ll then explore the world of recommendation systems such as those used in Netflix, Amazon, and others. You’ll also be provided a general introduction to the linear algebra that will help you throughout your deep learning coursework.

## Week 5: Data preparation (cleaning, regularization, dimensionality reduction)

One of the key parts of applying deep learning in practice is collecting the right type of training data. In this lesson, we’ll explore a variety of techniques to clean and regularize your data so that you can train effective models.

## Week 6: Drone image tracking (image classification with CNNs)

Convolutional Neural Networks (CNNs) are currently one of the most exciting advancements in neural networks, given that CNNs can now classify objects in images better than humans can. In this lesson, we’ll learn the intuition behind these networks and use them to track images of drones.
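The core operation inside a CNN is the convolution itself: sliding a small filter over an image and recording how strongly each patch matches it. Here is that single step in isolation, with an illustrative image and a hand-made edge-detecting filter:

```python
import numpy as np

# One convolution, the building block of a CNN: slide a 3x3 filter over
# an image and score how well each patch matches it. Image and filter
# here are illustrative.

def conv2d(image, kernel):
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1           # "valid" output height
    ow = image.shape[1] - kw + 1           # "valid" output width
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = (image[i:i+kh, j:j+kw] * kernel).sum()
    return out

image = np.zeros((5, 5))
image[:, 2:] = 1.0                          # left half dark, right half bright

vertical_edge = np.array([[-1.0, 0.0, 1.0]] * 3)  # responds to dark-to-bright edges

response = conv2d(image, vertical_edge)
print(response)                             # strongest where the edge sits
```

A CNN learns many such filters automatically, stacking them into layers that detect edges, then textures, then whole objects.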

## Week 7: Stock prediction (regression with RNNs)

In this lesson, we’ll learn about Recurrent Neural Networks, a type of network architecture particularly well suited to time series data. We’ll apply these networks to some of the most important time series data we have: stock prices!
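What makes a network "recurrent" is that the same weights are reused at every time step, with a hidden state carrying a summary of everything seen so far. A minimal forward pass over a toy price series (random, untrained weights; illustrative prices) looks like this:

```python
import numpy as np

# Forward pass of a vanilla recurrent cell over a toy price series.
# Weights are random stand-ins, not a trained model.

rng = np.random.default_rng(1)

prices = [101.0, 102.5, 101.8, 103.2, 104.0]   # illustrative closing prices

Wx = rng.normal(0, 0.5, (8, 1))  # input -> hidden
Wh = rng.normal(0, 0.5, (8, 8))  # hidden -> hidden (the recurrence)
Wy = rng.normal(0, 0.5, (1, 8))  # hidden -> predicted next move

h = np.zeros(8)                  # the network's "memory"
for p in prices:
    x = np.array([p / 100.0 - 1.0])          # crude normalization
    h = np.tanh(Wx @ x + Wh @ h)             # fold the new price into memory

prediction = (Wy @ h)[0]         # training would fit this to the true next move
print(prediction)
```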

## Week 8: Art generation (transfer learning)

Beyond simply predicting, deep neural networks are now also capable of generating music, images, and art based on samples. In this lesson, we’ll use neural networks to create new art based on artwork we feed in, using a technique known as Style Transfer.

## Week 9: Music generation (LSTMs applied to Audio)

Neural networks can also be applied to problems in audio, as the famous WaveNet paper by DeepMind has shown. In this lesson, we’ll use a type of Recurrent Neural Network called the LSTM (Long Short-Term Memory) network to generate new pieces of music based on existing samples.
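What distinguishes an LSTM from a plain recurrent cell is its gates: learned switches that decide what to forget, what to write into memory, and what to expose. Here is a single LSTM step with illustrative sizes and random stand-in weights:

```python
import numpy as np

# One step of an LSTM cell. Sizes and weights are illustrative stand-ins;
# in practice the gates are learned from data (e.g. frames of audio).

rng = np.random.default_rng(2)
n_in, n_hid = 4, 6

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One weight matrix per gate, each acting on [input, previous hidden state].
Wf, Wi, Wo, Wc = (rng.normal(0, 0.3, (n_hid, n_in + n_hid)) for _ in range(4))

def lstm_step(x, h_prev, c_prev):
    z = np.concatenate([x, h_prev])
    f = sigmoid(Wf @ z)                    # forget gate: keep or erase old memory
    i = sigmoid(Wi @ z)                    # input gate: how much new info to write
    o = sigmoid(Wo @ z)                    # output gate: how much memory to expose
    c = f * c_prev + i * np.tanh(Wc @ z)   # cell state: the long-term memory
    h = o * np.tanh(c)                     # hidden state: the short-term output
    return h, c

h, c = np.zeros(n_hid), np.zeros(n_hid)
for x in rng.normal(size=(10, n_in)):      # e.g. 10 frames of audio features
    h, c = lstm_step(x, h, c)

print(h.round(3))
```

The cell state `c` can carry information across many steps untouched, which is what lets LSTMs capture long-range structure in music.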

## Week 10: Poetry generation (LSTMs applied to NLP)

We’ll similarly extend our domain to include text and language as we use LSTMs to generate novel writing samples based on training data.

## Week 11: Language translation (sequence to sequence)

Neural Networks have been a fundamental part of the recent advancements in machine translation. The latest production versions of Google Translate and Baidu Translate both use deep learning architectures to automatically translate text from one language to another. This is done using a process known as Sequence to Sequence Learning, which we will explore in this lesson.

## Week 12: Chatbot QA System with voice (sequence to sequence more in-depth)

We’ll further explore Sequence to Sequence learning through building our very own Chatbot QA system that can answer unstructured queries from a user.

## Week 13: Game bot 2D (Reinforcement Learning via Monte-Carlo tree search)

Some of the most interesting advancements in deep learning have been in the field of Reinforcement Learning, where instead of training on a corpus of existing data, a network learns from live data it receives and adjusts accordingly. We’ll see how to apply Reinforcement Learning to build simple Game-Playing AIs that can win in a wide variety of Atari games.
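To make "learning from live experience" concrete, here is a much simpler relative of the lesson's methods: a tabular Q-learning agent (not Monte-Carlo tree search) on a 5-cell corridor, earning a reward only at the rightmost cell. All settings are illustrative:

```python
import random

# Tiny tabular Q-learning on a 5-cell corridor: the agent learns from the
# episodes it plays, not from a fixed dataset. Illustrative, not course code.

random.seed(0)
n_states, actions = 5, [-1, +1]               # move left or move right
Q = {(s, a): 0.0 for s in range(n_states) for a in actions}
alpha, gamma, eps = 0.5, 0.9, 0.2             # learning rate, discount, exploration

for _ in range(300):                           # play 300 episodes
    s = 0
    while s != n_states - 1:                   # rightmost cell ends the episode
        if random.random() < eps:
            a = random.choice(actions)         # explore
        else:
            a = max(actions, key=lambda act: Q[(s, act)])  # exploit
        s2 = min(max(s + a, 0), n_states - 1)
        r = 1.0 if s2 == n_states - 1 else 0.0  # reward only at the goal
        # Q-learning update: move the estimate toward reward + discounted future value
        best_next = max(Q[(s2, b)] for b in actions)
        Q[(s, a)] += alpha * (r + gamma * best_next - Q[(s, a)])
        s = s2

policy = [max(actions, key=lambda act: Q[(s, act)]) for s in range(n_states - 1)]
print(policy)   # a trained agent should always move right
```

Game-playing AIs scale this same idea up, swapping the lookup table for a deep network and the corridor for an Atari screen.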

## Week 14: Image compression (Autoencoders)

As recently shown by Google, deep learning can also be used to dramatically improve compression techniques. In this lesson we’ll explore using deep learning to build Autoencoders that automatically find sparse representations of data.
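The essence of an autoencoder fits in a few lines: squeeze the input through a narrow code, reconstruct it, and train to minimize the reconstruction error. Here is a linear version on synthetic data that secretly lies on a 2-D subspace (sizes, data, and learning rate are illustrative):

```python
import numpy as np

# A linear autoencoder in miniature: compress 8-D inputs to a 2-number
# code and reconstruct. The synthetic data has 2-D structure, so a good
# code loses nothing. All settings are illustrative.

rng = np.random.default_rng(3)

latent = rng.normal(size=(200, 2))       # two hidden factors per sample
mix = rng.normal(size=(2, 8))
X = latent @ mix                         # 8-D observations with 2-D structure

E = rng.normal(0, 0.1, (8, 2))           # encoder: 8 -> 2
D = rng.normal(0, 0.1, (2, 8))           # decoder: 2 -> 8
lr = 0.02

def loss():
    return ((X @ E @ D - X) ** 2).mean() # mean squared reconstruction error

initial = loss()
for _ in range(2000):
    R = X @ E @ D - X                    # reconstruction residual
    gE = X.T @ (R @ D.T) * 2 / X.size    # gradient with respect to the encoder
    gD = (X @ E).T @ R * 2 / X.size      # gradient with respect to the decoder
    E -= lr * gE
    D -= lr * gD

final = loss()
print(round(initial, 4), "->", round(final, 4))   # error should drop sharply
```

Real autoencoders use deep nonlinear encoders and decoders, but the compress-reconstruct-compare loop is the same.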

## Week 15: Data visualization (anomaly detection results in 2D and 3D)

In this lesson, you’ll apply deep learning to detect anomalies in data. This is extremely useful in applications such as fraud prevention with credit cards.

## Week 16: Image generation (generative adversarial networks)

As Yann LeCun has noted, Generative Adversarial Networks are one of the most fundamental recent advancements in deep learning. You’ll explore this state-of-the-art technique to generate images that most humans wouldn’t believe were generated by a computer.

## Week 17: One-shot learning (Probabilistic Programming)

Finally, we’ll look at one-shot learning, where a neural network is able to learn from just one (or a few) examples, as opposed to a large amount of data.
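A common baseline for one-shot classification (a stand-in here, not this lesson's probabilistic method) is nearest-neighbor matching: with one labeled example per class, assign a new item to whichever example it sits closest to. The feature vectors below are toy values; real systems first learn a feature space in which this distance is meaningful:

```python
import math

# One-shot classification in miniature: ONE labeled example per class,
# nearest-neighbor matching on toy feature vectors. Labels and vectors
# are illustrative stand-ins.

examples = {                     # one labeled feature vector per class
    "cat": [0.9, 0.1, 0.2],
    "dog": [0.2, 0.8, 0.1],
    "bird": [0.1, 0.2, 0.9],
}

def classify(features):
    def dist(label):
        return math.dist(features, examples[label])  # Euclidean distance
    return min(examples, key=dist)                   # closest example wins

query = [0.85, 0.15, 0.25]       # an unseen, cat-like item
print(classify(query))
```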

Through this curriculum, you will gain an exciting introduction to some of the most compelling advancements in deep learning! We hope you’ll join us on this journey, and we can’t wait to share more of these ideas with you.