Ashish Pawar, "The Over-Reliance on Softmax in Generative Models: Time to Move On?" (Nov 15): Explore why Softmax might actually be limiting generative models and peel apart some of its lesser-known flaws.
Dr. Roi Yehoshua in Towards Data Science, "Deep Dive into Softmax Regression" (May 25, 2023): Understand the math behind softmax regression and how to use it to solve an image classification task.
Gabriel Furnieles in Towards Data Science, "Sigmoid and SoftMax Functions in 5 minutes" (Sep 8, 2022): The math behind two of the most used activation functions in Machine Learning.
Deepankar Singh in AI-Enthusiast, "Guardians of the Network: Sigmoid, Softmax, ReLU, and Tanh on a Heroic Quest" (Nov 5): Discover the power of activation functions in neural networks! Dive into the essentials of Sigmoid, Softmax, ReLU, and Tanh.
Brian Williams in Towards Data Science, "Contrastive Loss Explained" (Mar 3, 2020): Contrastive loss has been used recently in a number of papers showing state-of-the-art results with unsupervised learning. MoCo, PIRL and…
Aneesh Vesapaga, "Activation Functions in ANN (Artificial Neural Networks): Simple Examples and Graphs" (Oct 22): What is an activation function?
Hunter Phillips, "A Simple Introduction to Softmax" (May 10, 2023): Softmax normalizes an input vector into a probability distribution using the exponential function.
Harriet, "Softmax Uncovered: Balancing Precision with Numerical Stability in Deep Learning" (Sep 18): Exploring the Softmax Function, Numerical Stability Techniques, and Hands-on Code Examples.