- Rayan Yassminh: "Practical Applications of Information Theory in Machine Learning" (Jul 8). The landscape of data tools is ever-changing. Each day, new libraries and tools are introduced, but they all rely on fundamental…
- Isra Sheikh: "Understanding Cross-Entropy Loss and Its Role in Classification Problems" (Oct 15, 2023). In 2023, machine learning is present in almost all of the applications we interact with daily. To ensure the optimal performance of…
- Varun Nakra in Towards AI: "Perfect answer to Deep Learning interview question — Why not quadratic cost function?" (Jun 1). One of the most common questions asked during deep learning knowledge interviews is — "Why can't we use a quadratic cost function to train…"
- Inara Koppert-Anisimova in unpack: "Cross-Entropy Loss in ML" (Jan 4, 2021). What is Entropy in ML?
- Vijay Maurya: "Understanding Loss Functions in Deep Learning for Classification" (Mar 27). Loss functions play a crucial role in training deep learning models, especially in tasks like classification where the goal is to minimize…
- Chady Dimachkie in Ntropy: "Is Cross-Entropy All You Need? Let's Discuss an Alternative" (Jul 10, 2023). We discuss the Smooth Generalized Cross-Entropy loss function, which addresses noisy labels, confidence calibration, and more.
- AI SageScribe: "Simple Explanation of Normalized Temperature Scaled Cross Entropy Loss (NTSCE Loss)" (Dec 29, 2023). In the intricate world of machine learning, the quest for not just accuracy but also for reliable and interpretable predictions is…
- Allohvk in Towards Data Science: "Cross entropy & classification losses — No math, few stories & lots of intuition" (Mar 16, 2021). There is something to be gained from every loss.