In Ethercourt Machine Learning, by WELTARE Strategies: "The Multinomial Distribution: Understanding the Probabilities Behind Your Language Models" (31 Generative AI Formulas to Know Before 2025 #5 — Special Edition). 6d ago.
In Towards Data Science, by Igor Šegota: "Logistic Regression: Faceoff" (What do log-losses and perfectly separated data have to do with hockey sticks?). May 18, 2023.
In Towards Data Science, by Dhruv Matani: "NT-Xent (Normalized Temperature-Scaled Cross-Entropy) Loss Explained and Implemented in PyTorch" (An intuitive explanation of the NT-Xent loss, with a step-by-step walkthrough of the operation and its implementation in PyTorch). Jun 13, 2023.
In Ethercourt Machine Learning, by WELTARE Strategies: "Mastering Cross-Entropy for AI Optimization" (31 Formulas to Know in Generative AI Before 2025 #3 — Special Edition). Dec 4.
By Isra Sheikh: "Understanding Cross-Entropy Loss and Its Role in Classification Problems" (In 2023, machine learning is present in almost all of the applications we interact with daily. To ensure the optimal performance of…). Oct 15, 2023.
By Sik-Ho Tsang: "Brief Review — Imbalanced Image Classification with Complement Cross Entropy" (Complement Cross Entropy (CCE) for imbalanced datasets). Sep 22.
By Marina Fuster: "Cross-Entropy Loss for Next Token Prediction in Transformers" (In light of the tremendous success of transformers on the next token prediction task, I've decided to create this post to…). Aug 6.
By Shripad pate: "Focal Loss for Dense Object Detection" (Focal loss is used as an alternative to cross-entropy loss to address class imbalance, particularly in object detection tasks…). Jul 9.