- Prem Vishnoi (cloudvala), "Understanding AlexNet: A Step-by-Step Guide to the Architecture That Revolutionized Deep Learning" (3d ago)
- Rishabh Singh, "Convolutional Neural Network (CNN) — Part 1": When classifying images, traditional neural networks struggle because each pixel is treated as an independent feature, which misses… (Nov 6, 2024)
- Sian, "The Curse of Vanishing Gradients and How Deep Learning Survives": There are so many things in life where order matters — cooking recipes, the plot of a drama, even the sequence of KakaoTalk messages. Deep… (5d ago)
- Gaurav Nair, "The Spark Your Neural Network Needs: Understanding the Significance of Activation Functions": From the traditional Sigmoid and ReLU to cutting-edge functions like GeLU, this article delves into the importance of activation functions… (Aug 22, 2023)
- Kabilan Vaikunthan, "Why is ReLU nonlinear?": We explore why exactly this seemingly linear function is in fact nonlinear, and how it manages to outperform other activation functions. (Mar 5)
- Abhishek Jain, "Dead neurons in Deep Learning, their effects and remedies to solve it": Dead neurons, also called dead units or inactive neurons, refer to neurons in a deep learning model that consistently output the same value… (Nov 16, 2024)
- KoshurAI, "Activation Functions: The Secret Sauce Behind AI's Brainpower": Introduction: Why Activation Functions Matter More Than You Think (Feb 22)
- Machine Learning in Plain English, "Convolutional Neural Network — Lesson 9: Activation Functions in CNNs": The Need for Non-Linearity: ReLU, Leaky ReLU, etc. (Jun 21, 2023)