AI Talks on #DeepLearning — Math, Inner Workings, Optimization
I am starting this AI Talks series to help engineers acquire fundamental machine learning and modeling skills. I intend to cover deep learning, classical machine learning, statistical modeling, math, and the data engineering aspects of artificial intelligence. The first three videos of the series are embedded below.
(I will soon post a link to webinars where you can attend the talks live.)
01 — Math behind Neural Networks
In this first talk, I introduce neural networks from the perspective of their inner workings and the math behind how learning is achieved within the network.
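To make the idea concrete, here is a minimal sketch (my own toy example, not code from the talk) of a one-hidden-layer network trained on XOR with hand-derived gradients. It shows the three pieces the talk discusses: the forward pass, the chain rule in the backward pass, and the gradient-descent update.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR: the classic problem that needs a hidden layer.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer of 8 sigmoid units, one sigmoid output.
W1 = rng.normal(0, 1, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 1, (8, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

loss0 = ((sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) - y) ** 2).mean()

lr = 1.0
for _ in range(2000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)      # hidden activations
    p = sigmoid(h @ W2 + b2)      # predictions
    # Backward pass: chain rule on squared error
    # (constant factor 2/N folded into the learning rate)
    dp = (p - y) * p * (1 - p)    # dL/d(pre-activation of output)
    dW2 = h.T @ dp; db2 = dp.sum(0)
    dh = dp @ W2.T * h * (1 - h)  # dL/d(pre-activation of hidden)
    dW1 = X.T @ dh; db1 = dh.sum(0)
    # Gradient-descent update
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

loss = ((p - y) ** 2).mean()
print(f"loss before: {loss0:.4f}, after: {loss:.4f}")
```

Running this, the loss drops as the weights move downhill along the hand-derived gradients; the same mechanics scale up to deep networks via automatic differentiation.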
02 — Subtle art of optimizing Neural Networks
In this talk, I cover partitioning schemes, weight initialization, and the basics of learning-rate optimization.
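As a small illustration of why initialization scale matters (an assumed example of mine, not material from the talk): unit-variance weights inflate the activation scale of a wide layer, while He initialization (variance 2/fan_in) keeps ReLU activations at a roughly constant scale. A simple step-decay learning-rate schedule is sketched alongside.

```python
import numpy as np

rng = np.random.default_rng(42)
fan_in = 512

# Naive init: unit-variance weights blow up the activation scale.
w_naive = rng.normal(0, 1.0, (fan_in, fan_in))
# He init: variance 2/fan_in, designed for ReLU layers.
w_he = rng.normal(0, np.sqrt(2.0 / fan_in), (fan_in, fan_in))

x = rng.normal(0, 1.0, fan_in)          # unit-scale input
relu = lambda z: np.maximum(z, 0)

s_naive = relu(x @ w_naive).std()       # large: grows with sqrt(fan_in)
s_he = relu(x @ w_he).std()             # stays near unit scale
print(f"naive: {s_naive:.2f}, He: {s_he:.2f}")

# A simple step-decay schedule: halve the learning rate every 10 epochs.
def step_decay(base_lr, epoch, drop=0.5, every=10):
    return base_lr * drop ** (epoch // every)
```

Stacking many layers compounds this effect, which is why a principled initialization is often the difference between a network that trains and one whose activations explode or vanish.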
03 — Feature Scaling and Regularization of Neural Nets
This talk covers data characteristics, feature scaling, error analysis, and regularization.
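Here is a hedged sketch of two of these ideas together (toy data I made up, not from the talk): standardizing features with wildly different scales to zero mean and unit variance, then fitting an L2-regularized (ridge) linear model in closed form.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy data: two informative features on very different scales.
X = np.c_[rng.normal(0, 1, 100), rng.normal(0, 1000, 100)]
y = X @ np.array([3.0, 0.002]) + rng.normal(0, 0.1, 100)

# Feature scaling: standardize each column to zero mean, unit variance.
mu, sigma = X.mean(0), X.std(0)
Xs = (X - mu) / sigma

# L2 regularization (ridge): solve (X^T X + lam * I) w = X^T y.
lam = 1.0
w = np.linalg.solve(Xs.T @ Xs + lam * np.eye(2), Xs.T @ y)
print(w)
```

After scaling, both coefficients land on a comparable scale, so a single regularization strength penalizes them fairly; without scaling, the penalty would fall almost entirely on the small-scale feature's weight.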
I will post more talks as the series progresses. Do let me know if this is useful for the general audience on my follow list, and what you would like me to cover in my next series of talks. Happy learning!