AI Saturdays Bangalore Chapter — Week 1 Reflections

Sriram Jaju · Published in AI Saturdays · 3 min read · Aug 13, 2018

Mission: We started our little community of AISaturdays in Bangalore to bring together like-minded people and help each one teach one.

We have begun our journey and the response has been encouraging. The community is absolutely stellar — people are friendly, supportive, engaging, and genuinely want to help out.

Thanks to Go-Jek Tech India for providing us a wonderful venue.


Being an ambassador is not an easy job, but we have help from other ambassadors who are motivated and inspiring when it comes to building a community of self-motivated learners.

We started our Week 1 by going through Course 1 of deeplearning.ai by Andrew Ng. Here are some of the topics we covered:

What is a neural network? — A model that, taking inspiration from the brain, is composed of layers (at least one of which is hidden) consisting of simple connected units or neurons followed by nonlinearities.
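
To make that concrete, here is a minimal sketch of a one-hidden-layer forward pass in NumPy (this is my own illustration with made-up shapes and random data, not the course's code):

```python
import numpy as np

rng = np.random.default_rng(0)

x = rng.normal(size=(3, 1))          # input: 3 features, 1 example
W1 = rng.normal(size=(4, 3))         # hidden layer: 4 units
b1 = np.zeros((4, 1))
W2 = rng.normal(size=(1, 4))         # output layer: 1 unit
b2 = np.zeros((1, 1))

hidden = np.maximum(0, W1 @ x + b1)  # linear step followed by a ReLU nonlinearity
output = W2 @ hidden + b2            # final linear step
print(output.shape)                  # (1, 1): a single prediction
```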

Why is Deep Learning Radically Different from Machine Learning?

Logistic Regression — A model that generates a probability for each possible discrete label value in classification problems by applying a sigmoid function to a linear prediction. Although logistic regression is often used in binary classification problems, it can also be used in multi-class classification problems (where it is called multi-class logistic regression or multinomial regression).
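
As a rough illustration (with hypothetical hand-picked weights rather than learned ones), the prediction is just a sigmoid applied to a linear score:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

w = np.array([0.5, -1.2])   # hypothetical learned weights
b = 0.1                     # hypothetical learned bias
x = np.array([2.0, 0.3])    # one example with 2 features

z = w @ x + b               # linear prediction (a real-valued score)
p = sigmoid(z)              # probability of the positive class
print(p)                    # ~0.68
```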

What is Gradient Descent? — A technique to minimize loss by computing the gradients of the loss with respect to the model’s parameters, conditioned on training data. Informally, gradient descent iteratively adjusts parameters, gradually finding the best combination of weights and bias to minimize loss.
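
Here is a toy sketch of that update loop for a single made-up training example, reusing the logistic-regression setup above; the line w -= lr * gradient is the whole idea:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

x, y = np.array([1.5, -0.5]), 1.0   # one training example and its label
w, b = np.zeros(2), 0.0             # start from zero parameters
lr = 0.1                            # learning rate

for _ in range(100):
    p = sigmoid(w @ x + b)          # current prediction
    dz = p - y                      # d(loss)/d(z) for cross-entropy loss
    w -= lr * dz * x                # step opposite the gradient
    b -= lr * dz

print(sigmoid(w @ x + b))           # prediction has moved toward y = 1
```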

Backpropagation — The primary algorithm for performing gradient descent on neural networks. First, the output values of each node are calculated (and cached) in a forward pass. Then, the partial derivative of the error with respect to each parameter is calculated in a backward pass through the graph.
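
A minimal sketch of those two passes for a tiny one-hidden-layer network (illustrative shapes and data, using the standard cross-entropy-plus-sigmoid gradient shortcut):

```python
import numpy as np

rng = np.random.default_rng(1)
x, y = rng.normal(size=(3, 1)), np.array([[1.0]])
W1, b1 = rng.normal(size=(4, 3)) * 0.1, np.zeros((4, 1))
W2, b2 = rng.normal(size=(1, 4)) * 0.1, np.zeros((1, 1))

# Forward pass: compute and cache each node's output.
z1 = W1 @ x + b1
a1 = np.maximum(0, z1)                # ReLU
z2 = W2 @ a1 + b2
a2 = 1.0 / (1.0 + np.exp(-z2))        # sigmoid output

# Backward pass: partial derivative of the error w.r.t. each parameter.
dz2 = a2 - y                          # cross-entropy + sigmoid shortcut
dW2 = dz2 @ a1.T
db2 = dz2
dz1 = (W2.T @ dz2) * (z1 > 0)         # chain rule through the ReLU
dW1 = dz1 @ x.T
db1 = dz1
```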

Activation Functions — A function (for example, ReLU or sigmoid) that takes in the weighted sum of all of the inputs from the previous layer and then generates and passes an output value (typically nonlinear) to the next layer.
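
For example (with made-up input values), here is how a few common activations transform the same weighted sums:

```python
import numpy as np

z = np.array([-2.0, 0.0, 3.0])     # weighted sums from the previous layer

relu = np.maximum(0, z)            # ReLU: max(0, z)
sig = 1.0 / (1.0 + np.exp(-z))     # sigmoid: squashes into (0, 1)
tanh = np.tanh(z)                  # tanh: squashes into (-1, 1)

print(relu)   # [0. 0. 3.]
print(sig)    # [0.119 0.5   0.953]
```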

Which Activation Function Should I Use?

What are the advantages of ReLU over sigmoid function in deep neural networks?
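
One quick numeric illustration of the usual answer: sigmoid’s gradient decays toward zero for large inputs (the vanishing-gradient problem), while ReLU’s gradient stays at 1 for any positive input:

```python
import numpy as np

z = np.array([0.0, 2.0, 5.0, 10.0])

sig = 1.0 / (1.0 + np.exp(-z))
sig_grad = sig * (1.0 - sig)        # peaks at 0.25, then decays fast
relu_grad = (z > 0).astype(float)   # constant 1 for positive inputs

print(sig_grad)    # [0.25  0.105 0.0066 4.5e-05]
print(relu_grad)   # [0. 1. 1. 1.]
```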

Why Initialize a Neural Network with Random Weights?
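
A small sketch of the usual answer, “symmetry breaking”: with all-zero weights every hidden unit computes the same thing and receives the same gradient, so random initialization is what lets units learn different features:

```python
import numpy as np

rng = np.random.default_rng(42)
x = rng.normal(size=(3, 1))

W_zero = np.zeros((4, 3))
print((W_zero @ x).ravel())        # [0. 0. 0. 0.]: every unit is identical

W_rand = rng.normal(size=(4, 3)) * 0.01   # small random values break symmetry
print((W_rand @ x).ravel())        # four distinct pre-activations
```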

We did hands-on exercises implementing activation functions in Python, along with exercises on NumPy and Pandas. We will be doing more hands-on coding in the coming meetups because, as they say…

Neurons that fire together wire together

It was overwhelming to accomplish all this in Week 1, and I want to thank all the attendees for being eager and accommodating.


I will list all the resources we discussed in the meetup to help other readers (including future me):

This wouldn’t have been possible without the help of the other ambassadors. Thank you!

Meetup in progress.

Tell us how your Week 1 went in our #bangalore Slack channel or in the comments below.


Last but not least:

  1. Sign up here to attend the next meetups.
  2. All the materials discussed in the meetup can be found in the GitHub repo.
  3. Follow AISaturdays Bangalore on Twitter.
  4. You can find me on Twitter.

We will dive deep into other deep learning concepts, do hands-on coding, and build things from scratch, so stay tuned. See y’all next Saturday.

Namaste!
