- "Vanishing Gradient Problem: The Ghost Haunting Deep Neural Networks" by Deepankar Singh, in AI-Enthusiast. Discover the Vanishing Gradient Problem in deep learning, its causes, impact, solutions, and how it shaped modern neural network advancement.
- "How does ReLU enable Neural Networks to approximate continuous nonlinear functions?" by Thi-Lam-Thuy LE. Learn how neural networks with one hidden layer using ReLU activation represent continuous nonlinear functions.
- "Image classification using CNN" by SP. Convolutional Neural Network (CNN) is a well-established architecture: a supervised machine learning methodology used mainly in…
- "Exploring Activation Functions: The Building Blocks of Neural Networks" by Shivam Singh. Activation functions in neural networks serve as mapping mechanisms that transform the weighted sum of a neuron's inputs into another…
- "The Spark Your Neural Network Needs: Understanding the Significance of Activation Functions" by Gaurav Nair. From the traditional Sigmoid and ReLU to cutting-edge functions like GeLU, this article delves into the importance of activation functions…
- "Dead neurons in Deep Learning, their effects and remedies to solve it" by Abhishek Jain. Dead neurons, also called dead units or inactive neurons, refer to neurons in a deep learning model that consistently output the same value…
- "Convolutional Neural Network (CNN) — Part 1" by Rishabh Singh. When classifying images, traditional neural networks struggle because each pixel is treated as an independent feature, which misses…
- "Understanding the ReLU Activation Function in Neural Networks" by Andi Ardiansyah. The Rectified Linear Unit (ReLU) is a popular activation function widely used in neural networks, especially in deep learning models. Due…