AI Saturdays Bangalore Chapter — Week 2 Reflections

After kicking off the winter 2018 cycle of AI Saturdays to an amazing start, our next session was even better. With over 60 participants from different walks of their careers, from students to seasoned tech professionals trying to understand AI, and over 100 more joining via the live stream, it was a Saturday filled with enthusiasm for this path-breaking technology. In this blog, I will cover the topics discussed in the session and share some resources to help you learn these concepts.

In the previous session, we had discussed the workings of a neural network, the basic variant of the gradient descent algorithm, its extension to the backpropagation algorithm, and the various activation functions currently used in the field. These topics mostly constitute the first course of the specialization.

This week, we delved deeper into tuning deep learning models with regularization techniques, discussed the variants of gradient descent, and introduced a few more sophisticated optimization algorithms, RMSProp and Adam, with thorough explanations. We then discussed how techniques like batch normalization help us reach the minimum much faster. We also covered gradient checking, a way to verify the correctness of backpropagation in a network, and saw how careful weight initialization partially mitigates the vanishing/exploding gradient problem. In-depth guides to all these topics are given below.
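To make the optimizer discussion concrete, here is a minimal NumPy sketch of the Adam update rule (which combines momentum with RMSProp-style gradient scaling). The function name and the toy objective are my own; the hyperparameter defaults are the commonly cited ones.

```python
import numpy as np

def adam_update(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam step: momentum term (m) plus RMSProp-style scaling (v)."""
    m = beta1 * m + (1 - beta1) * grad          # first moment: running mean of gradients
    v = beta2 * v + (1 - beta2) * grad ** 2     # second moment: running mean of squared gradients
    m_hat = m / (1 - beta1 ** t)                # bias correction for early steps
    v_hat = v / (1 - beta2 ** t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v

# Toy example: minimize f(w) = w^2, whose gradient is 2w.
w, m, v = 5.0, 0.0, 0.0
for t in range(1, 2001):
    w, m, v = adam_update(w, 2 * w, m, v, t, lr=0.05)
print(w)  # ends up near 0, the minimum
```

Setting `beta1=0` recovers plain RMSProp-style updates, which is a handy way to see how the two algorithms relate.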

Topics discussed:

1. L1 and L2 regularization

2. Dropout regularization and early stopping

3. Gradient checking

4. Variants of gradient descent and other optimization algorithms

5. Batch normalization

6. Softmax classifier
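As a rough illustration of a few of the topics above, here is a small NumPy sketch of inverted dropout, an L2 cost penalty, and a numerically stable softmax. The keep probability and λ values are arbitrary choices for the example, not recommendations.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout_forward(a, keep_prob=0.8):
    """Inverted dropout: zero out units, then rescale so the expected output is unchanged."""
    mask = (rng.random(a.shape) < keep_prob).astype(a.dtype)
    return a * mask / keep_prob

def l2_penalty(weights, lam=0.01):
    """L2 regularization term added to the cost: (lam/2) * sum of squared weights."""
    return 0.5 * lam * sum(np.sum(w ** 2) for w in weights)

def softmax(z):
    """Numerically stable softmax over the last axis."""
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

a = np.ones((1000, 100))
print(dropout_forward(a).mean())                 # close to 1.0 on average, by design
print(softmax(np.array([1.0, 2.0, 3.0])).sum())  # probabilities sum to 1.0
```

Note that dropout is applied only during training; at test time the full activations are used, which is exactly why the inverted-dropout rescaling keeps the two regimes consistent.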

All the above components, together with the elements discussed in the previous two sessions, make up a complete deep learning model. The participants have now gained valuable experience and insight into the various components of a neural network model and are capable of building a well-generalized deep learning model.

Having built an in-depth theoretical understanding over the past two sessions, we will now head towards solving real-world problems with hands-on coding in the upcoming meetups. The enthusiasm shown by the Bangalore members has been fantastic, and the other ambassadors of the Bangalore chapter and I are thrilled to take it forward to newer heights from here.

And finally, we would like to thank for being a great host :)

Our upcoming session, which will be held at the Nvidia office, will cover the third course from and introduce you to the amazing PyTorch framework. After that, in subsequent sessions, we will dive deep into building complex deep learning models to solve problems and see their amazing capabilities.

  1. To attend the next session, fill out the form here.
  2. Assignments for the sessions conducted till date can be found here.
  3. Sign up here to attend the next meetups.
  4. All the materials related to the meetup can be found in the GitHub repo.
  5. Follow AISaturdays Bangalore on Twitter.