Learning Deep Learning — fast.ai vs. deeplearning.ai

Mark Ryan
Published in Zero Equals False
Nov 2, 2017 · 4 min read

I read with interest the superb post on Andrew Ng’s Deep Learning courses by Arvind Nagaraj. This thought-provoking article explained very eloquently Dr. Ng’s approach to the topic. It also pointed me towards Jeremy Howard’s fast.ai Deep Learning course, and thus began a dilemma for me: should I spend the time I have available for learning more about deep learning on Andrew Ng’s deeplearning.ai courses or on Jeremy Howard’s fast.ai course?

I got my initial taste of machine learning thanks to Andrew Ng’s ML introduction course. When I decided that I wanted to investigate deep learning, it was natural to explore the courses in Ng’s Deep Learning Specialization. All the elements of these courses (style of instruction, notation, and navigation) were familiar from the machine learning intro.

So far, so good. Of the five courses in the specialization, I completed the four that were available as of late 2017. After the third course, while I was waiting for the fourth to become available, I read Mr. Nagaraj’s post and decided to take a look at the fast.ai deep learning course.

My initial impressions of fast.ai course 1 (the basis for the original version of this article) were based on version 1 of the course, which was available in 2017. Once version 2 became available in early 2018, I redid some of the lessons and completed the lessons I had not done in version 1. I have reworked this article to reflect my impressions of version 2 of the fast.ai course.

There are several ways to contrast the deeplearning.ai set of courses with fast.ai course 1:

  1. Time commitment: the deeplearning.ai courses can give the impression of faster progress because each week is a distinct chunk that can be completed in a few hours. By comparison, fast.ai lessons are open-ended and can take a couple of days of solid work to get through thoroughly.
  2. Coding environment: the coding environment for deeplearning.ai is completely curated and ready to go. This means you can attack the coding assignments immediately without spending any time on setup. By comparison, fast.ai recommends a couple of potential coding environments that you need to set up yourself. I chose Paperspace, for which fast.ai provides relatively straightforward instructions. An unexpected benefit of Paperspace is the modest hourly cost: putting a price on compute forced me to think twice before kicking off a run and to look for ways to make my experiments more efficient.
  3. Framework: deeplearning.ai uses TensorFlow, while version 2 of the fast.ai course uses PyTorch. I haven’t dug deep into either framework (for better or worse, I’ve used Keras), but I think that overall PyTorch is a better choice for learners. This article is a great contrast between the two frameworks. For my money, while TensorFlow has a huge community and is pretty much the de facto standard for deep learning, the (pre 2.0) TensorFlow approach of defining a graph statically before the model can run makes my teeth hurt (see the sketch after this list for what that difference looks like in code).
  4. Feedback: if you pay for the Coursera subscription, you can submit quizzes and coding assignments for deeplearning.ai and get immediate feedback. By contrast, fast.ai does not have a venue for providing direct feedback to people taking the course online, though there is a big community of people working through the courses at any point in time, so it’s possible to get answers to questions.
  5. Teaching style: deeplearning.ai material is aimed at online learners, with the videos recorded “in studio” explicitly for an online audience. The fast.ai videos, on the other hand, are recordings of a live, in-classroom delivery of the course. At first I preferred the deeplearning.ai approach, in part because I was used to it from Ng’s ML intro course. After going through more of the fast.ai lessons, however, I strongly prefer the classroom recordings. The students in the class ask really good questions — often on points that I was confused about but hadn’t distilled into a cogent question — and this interplay between Howard and the class is really, really useful.
  6. Applicability: I’ve attempted to apply what I’ve learned in these courses to a couple of problems in my day-to-day work, and there is absolutely no comparison between deeplearning.ai and fast.ai on applicability. fast.ai is the winner hands down. The fast.ai coding examples are more applicable to my area of interest and more detailed. By comparison, the deeplearning.ai coding examples are too simplistic to be useful as templates for a real-world problem. I think this is the cost of deeplearning.ai’s frictionless coding environment and automated coding assignment feedback.
  7. Course material organization: both courses have a lot of supplemental material. The deeplearning.ai material is better organised and easier to search, for example if you need to review a particular concept. Compared to version 1 of the course, version 2 fast.ai material is easier to navigate, but it’s still not as smooth as deeplearning.ai.
  8. Theoretical material: deeplearning.ai has an edge on theoretical material, and Ng’s style of providing “good enough” explanations of the math behind deep learning is very good. That being said, the weakness of the deeplearning.ai coding examples undermines the value of the theoretical explanations because it’s too easy to skim the surface, get your “green check mark” and not really understand what’s going on.
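
To make the framework contrast in point 3 concrete, here is a minimal sketch of the same tiny matrix multiply written both ways. This is my own illustration, not code from either course: the pre-2.0 TensorFlow version has to describe a static graph and then run it inside a session, while the PyTorch version simply executes each line as it goes.

```python
import tensorflow as tf  # assumes a pre-2.0 (1.x) TensorFlow install
import torch

# Pre-2.0 TensorFlow: first describe the computation as a static graph...
x = tf.placeholder(tf.float32, shape=(None, 3))
w = tf.Variable(tf.random_normal((3, 1)))
y = tf.matmul(x, w)

# ...then open a session and feed in data to get actual numbers.
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(y, feed_dict={x: [[1.0, 2.0, 3.0]]}))

# PyTorch: operations run eagerly, so you can inspect values as you go.
xt = torch.tensor([[1.0, 2.0, 3.0]])
wt = torch.randn(3, 1, requires_grad=True)
yt = xt @ wt   # computed immediately
print(yt)      # an ordinary tensor you can print or step through in a debugger
```

Being able to print an intermediate tensor or drop into a debugger at any point is, for me, a big part of why PyTorch feels more approachable for learners.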

The fast.ai course has had a big impact on me. It set me on a path to investigate deep learning with tabular, structured data that culminated with me writing a book for Manning Publications: Deep Learning with Structured Data.

Overall, if your goal is to be able to learn about deep learning and apply what you’ve learned, the fast.ai course is a better bet. If you have the time, interleaving the deeplearning.ai and fast.ai courses is ideal — you get the practical experience, applicability, and audience interaction of fast.ai, along with the organised material and theoretical explanations of deeplearning.ai.

(Image: running all night at $1/hour)

Mark Ryan
Technical writing manager at Google. Opinions expressed are my own.