Dhruv Matani, in Towards Data Science: "NT-Xent (Normalized Temperature-Scaled Cross-Entropy) Loss Explained and Implemented in PyTorch". An intuitive explanation of the NT-Xent loss, with a step-by-step walkthrough of the operation and our implementation in PyTorch. (Jun 13, 2023; a minimal sketch of this loss appears after the list)
Yashwanth S: "Regression Series -06: Logistic Regression for Classification". Imagine you’re a doctor trying to predict whether a patient will develop a disease based on their symptoms and medical history. Or, picture… (Jul 31)
Noor Fatima: "Understanding Perceptron Loss Function, Hinge Loss, Binary Cross Entropy, and the Sigmoid Function". The realm of machine learning and neural networks is vast, and understanding foundational concepts is crucial for building robust models… (Jul 24)
Mustafa604: "GANs Specialization". Embarking on the journey of understanding the GANs specialization from deeplearning.ai can be both exhilarating and daunting, especially for… (Jun 10)
Frederik vom Lehn, in Advanced Deep Learning: "Complete Guide to Neural Networks". Based on a given dataset, a neural network creates a function f which maps the relationship between features Xi and labels Y. The… (Aug 18, 2023)
Ebrahim Mousavi: "ML Series: Day 13 — Logistic Regression (Part 2)". Exploring the Squashing Method: An Enhanced Approach for Outlier Resilience in Logistic Regression. (Mar 26)
Sahilcarterr: "Why nn.BCEWithLogitsLoss Numerically Stable". Numerical stability is a 🌟 crucial consideration in machine learning. BCEWithLogitsLoss is a loss that combines a Sigmoid layer and the… (Dec 26, 2023)
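To illustrate the claim in the last entry: nn.BCEWithLogitsLoss combines a Sigmoid layer and BCELoss in a single class, and evaluating both together in log-space is more numerically stable than applying a plain sigmoid followed by BCELoss. A minimal comparison; the specific logit values below are made up purely to make the saturation visible:

```python
import torch
import torch.nn as nn

logits = torch.tensor([0.4, -120.0])    # raw scores; the second is extreme on purpose
targets = torch.tensor([1.0, 1.0])

# Fused: sigmoid and binary cross-entropy evaluated together in log-space,
# so even the extreme logit contributes its exact loss of roughly 120.
fused = nn.BCEWithLogitsLoss()(logits, targets)

# Separate: sigmoid first, then BCE on the resulting probability. sigmoid(-120)
# underflows to exactly 0.0 in float32, so this loss is computed from a degenerate
# probability and the gradient signal through the saturated sigmoid is lost.
separate = nn.BCELoss()(torch.sigmoid(logits), targets)

print(fused.item(), separate.item())    # the two values disagree once sigmoid saturates
```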
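And for the first entry: NT-Xent is the contrastive objective popularized by SimCLR. What follows is a minimal sketch of the standard formulation, not the article's own code; the function name and the default temperature of 0.5 are illustrative:

```python
import torch
import torch.nn.functional as F

def nt_xent_loss(z1, z2, temperature=0.5):
    """NT-Xent over a batch: z1 and z2 are (N, D) projections of two augmented views."""
    n = z1.size(0)
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)   # (2N, D) unit-norm rows
    sim = (z @ z.t()) / temperature                       # (2N, 2N) scaled cosine similarities
    sim.fill_diagonal_(float("-inf"))                      # a sample is never its own negative
    # Row i's positive is row i + N (and vice versa), so the loss is a 2N-way
    # cross-entropy where the target class is the index of the other view.
    targets = torch.cat([torch.arange(n) + n, torch.arange(n)]).to(sim.device)
    return F.cross_entropy(sim, targets)

# Tiny smoke test with random embeddings (batch of 8, 128-dim projections).
loss = nt_xent_loss(torch.randn(8, 128), torch.randn(8, 128))
print(loss.item())
```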