Binge Learning from Deeplearning.ai

I completed the intro Machine Learning class from Andrew Ng on Coursera last year. Here's a good review of it by Arthur Chan, who is also the moderator of the popular Facebook AIDL group and its related newsletter. I loved Prof. Ng's teaching style and have been following him on social media since then.

When he announced a new Deep Learning specialization last month, I knew I had to do it. It was a simple (ha!) matter of finding the time, even more so after reading a great review by Arvind N and feedback from other fast learners in the FB study group dedicated to this class.

I decided that this long Labor Day weekend (US) was the time to binge it. I also wanted to get this done before he releases the CNN and RNN/LSTM courses, likely later this month, so I can better participate in the FB study group. The weekend is almost up, and I'm happy to say: mission accomplished! Here are my completion certs for Course 1, Course 2, and Course 3 of this specialization. Kudos to Prof. Ng and the staff for a well-executed set of DL courses so far.

A bit of background: I'm a full-stack software engineer/manager. My current stack is MERN/Swift/Java Android/AWS/bash/Mac/docker/linux, with occasional Python and Ruby, and very occasional Kotlin and Go. Before that I messed around for a few years with C/C++, and spent many years on various Java/SQL and event-driven/async systems.

I'm hoping to add a lot of ML aspects into the mix, with special interest in constrained environments like mobile/IoT: MobileNet, Mobile TensorFlow, ShuffleNet, Core ML, Caffe2Go. (Loved reading the story behind the not-hotdog app, BTW.) I'm also very interested in applied ML, like production ML data pipelines using GPUs/TPUs.

Random Thoughts on the 3 Courses

My familiarity with Prof. Ng's lecture style enabled me to run the videos at 1.5x and sometimes even 2.0x speed. I'm not doing that with the DL Heroes videos, as I'm not as familiar with those speakers. I spent more time reading through and experimenting with the programming assignments, as they solidify the concepts. I also set up some of the course notebooks to run locally by downloading the dependent resources from Coursera.

The first two courses in this deeplearning.ai specialization really felt like in-depth follow-ups to classes 2, 3, 4, 5 and to some extent 6 of the intro ML class. Aside from improving my intuition on NNs in general, the topics related to hyperparameter tuning, optimization, and structuring projects are quite helpful. The TF intro, where he shows how to simplify the implementation of functionality done via numpy in earlier weeks, is great too.
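To make that concrete, here's a toy sketch of the kind of hand-rolled numpy cost computation the early assignments have you write (my own example with made-up numbers, not the course's code); in TF the whole thing collapses to a single call like tf.nn.sigmoid_cross_entropy_with_logits:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cross_entropy_cost(A, Y):
    """Binary cross-entropy cost, implemented by hand with numpy,
    the way the early assignments do it before introducing TF."""
    m = Y.shape[1]
    return float(-np.sum(Y * np.log(A) + (1 - Y) * np.log(1 - A)) / m)

# Toy example: activations for 3 examples vs. their labels
A = sigmoid(np.array([[2.0, -1.0, 0.0]]))
Y = np.array([[1.0, 0.0, 1.0]])
cost = cross_entropy_cost(A, Y)   # roughly 0.378
```

Writing it out once by hand like this is exactly what makes the one-line framework version feel earned rather than magic.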

"Saddle point optima", "ball rolling down in a bowl", and "Pandas vs. Caviar" are some nice summary phrases for critical concepts. The first has probably been around for a while, but I didn't know the phrase until this class.
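The "ball rolling down in a bowl" image describes gradient descent with momentum: the velocity term smooths the updates the way momentum carries a ball past small bumps. A minimal numpy sketch on a quadratic bowl, with hyperparameter values of my own choosing for illustration:

```python
import numpy as np

def momentum_descent(w0, lr=0.1, beta=0.9, steps=300):
    """Gradient descent with momentum on the bowl f(w) = 0.5 * ||w||^2.
    beta controls how much past gradients ("velocity") carry forward."""
    w = np.asarray(w0, dtype=float)
    v = np.zeros_like(w)
    for _ in range(steps):
        grad = w                      # gradient of 0.5 * ||w||^2 is w
        v = beta * v + (1 - beta) * grad
        w = w - lr * v
    return w

w_final = momentum_descent([3.0, -4.0])
# the "ball" settles near the bottom of the bowl at the origin
```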

The Soothing Professor

The great thing about Prof. Ng's classes is that he does a good bit of hand-holding and "spoon feeding", for instance in the programming assignments and wherever calculus/derivatives are involved. This is critical for building basic foundations without getting distracted or losing the big picture, and I wish more instructors would use this approach for new ML topics. Here are some soothing phrases sprinkled throughout the lecture videos.

“For those of you a bit more familiar with calculus … (it’s OK if you don’t follow)”
“Don’t worry about it.”
“You don’t need to know ….”
“Deep learning frameworks like tensorflow, PaddlePaddle, keras or caffe come with a dropout layer implementation. Don’t stress — you will soon learn some of these frameworks.”
“To save you from doing the calculus, you should get ….”
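For the curious, the idea those frameworks package into a dropout layer can be sketched in a few lines of numpy. This is "inverted dropout" as the assignment presents it: randomly zero activations and scale the survivors by 1/keep_prob so the expected activation is unchanged (the function name and shapes here are my own, not any framework's API):

```python
import numpy as np

def dropout_forward(A, keep_prob=0.8, rng=None):
    """Inverted dropout: zero each activation with probability
    1 - keep_prob, then rescale survivors so E[output] == input."""
    if rng is None:
        rng = np.random.default_rng(0)  # fixed seed for reproducibility
    mask = (rng.random(A.shape) < keep_prob).astype(A.dtype)
    return A * mask / keep_prob, mask

A = np.ones((4, 5))
A_drop, mask = dropout_forward(A, keep_prob=0.8)
# surviving entries are scaled to 1 / 0.8 = 1.25; the rest are zero
```

At test time you simply skip the layer, which is why the rescaling happens during training.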

Expectations after completing

Completing these (or any) classes will not make you an ML expert, but it will definitely make you more than ready for Kaggle, enable you to read and understand some of the literature, and prepare you for hobby, meetup, or professional projects in ML.

To understand the more advanced research published on arXiv or at ICML, follow emerging trends, or solve open challenges, one needs to run the "Idea, Code, Experiment" loop suggested by Prof. Ng on as many real-world projects as possible. Also, the idea of "build your first system quickly and iterate" will be quite familiar to followers of the lean startup method, where it is critical to build a Minimum Viable Product quickly and iterate to reduce risk.

I think the intro ML class was released in 2013 or so. There was a statement in one of the videos in that class along the lines of “If you complete this class you know as much as most people using ML already know”. This is definitely not true anymore and may have been removed from the current version of that class. That type of language is not there in the deeplearning.ai specialization videos. Instead phrases like “will help you break into AI” are used.

Getting Ready for Courses 4 and 5

As prep work for courses 4 and 5, I've found videos by Brandon Rohrer quite helpful so far for high-level CNN and RNN/LSTM intuition. While I have not yet done cs231n, its lecture 10, on combining CNNs with LSTMs for image captioning, could be handy prep work too.

Feedback

Feedback can be sent to feedback AT deeplearning DOT ai or via Twitter: #deeplearniNgAI

If courses 4 and 5 can use even higher-level Keras abstractions with TensorFlow as the backend, that would be cool. Keras already supports TensorFlow as a backend, and work is underway for Keras to support other computation backends like MXNet and CNTK.

Help us develop intuition for combining different NN techniques (CNN+RNN+LSTM+search methods), and maybe even address topics specific to ML on mobile/constrained environments.

Longer term, given that Prof. Ng's PhD thesis was in the area of RL, a course on that would also be great, to aid self-learners struggling with CS294 or trying to understand the intricacies of how AlphaGo did what it did!