Contrastive Learning in 3 Minutes
The exponential progress of contrastive learning in self-supervised tasks
While deep learning research has long been steered towards supervised image recognition tasks, many have now turned to a far less explored territory: performing the same tasks in a self-supervised manner. One of the cornerstones that led to the dramatic advancements in this seemingly impossible task is the introduction of contrastive learning losses. This article dives into some of the recently proposed contrastive losses that have pushed the results of unsupervised learning to heights comparable to supervised learning.
InfoNCE Loss
One of the earliest contrastive learning losses is the InfoNCE loss by Oord et al. Their paper Representation Learning with Contrastive Predictive Coding proposes the following loss:
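In the notation of that paper, $f_k(x, c_t)$ scores how well a sample $x$ matches the context $c_t$, and $X = \{x_1, \dots, x_N\}$ contains one positive sample $x_{t+k}$ and $N-1$ negative samples:

$$\mathcal{L}_N = -\,\mathbb{E}_{X}\left[ \log \frac{f_k(x_{t+k}, c_t)}{\sum_{x_j \in X} f_k(x_j, c_t)} \right]$$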
where the numerator is essentially the score of the positive pair, and the denominator is the sum of the scores of the positive pair and all negative pairs. Ultimately, this simple loss forces the positive pair to have a greater score than the negatives (pushing the fraction inside the log towards 1 and hence the loss towards 0), pulling positive pairs together and pushing negative pairs further apart.
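To make the mechanics concrete, here is a minimal sketch of how an InfoNCE-style loss is commonly implemented in practice. This is a hypothetical `info_nce_loss` written in PyTorch, not the authors' reference code; it assumes each query embedding in a batch has exactly one positive key and treats all other keys in the batch as negatives.

```python
# Minimal InfoNCE-style loss sketch (assumed setup, not the CPC authors' code).
# queries[i] and keys[i] form a positive pair; every other key in the batch
# acts as a negative for query i.
import torch
import torch.nn.functional as F


def info_nce_loss(queries: torch.Tensor, keys: torch.Tensor, temperature: float = 0.1) -> torch.Tensor:
    """Compute an InfoNCE loss for a batch of (query, key) positive pairs."""
    # Normalize so the dot product equals cosine similarity.
    queries = F.normalize(queries, dim=1)
    keys = F.normalize(keys, dim=1)

    # Pairwise similarity matrix: entry (i, j) scores query i against key j.
    logits = queries @ keys.t() / temperature  # shape: (batch, batch)

    # The positive pair for query i is key i, so the target is the diagonal.
    targets = torch.arange(queries.size(0), device=queries.device)

    # Cross-entropy over each row: numerator = positive pair score,
    # denominator = sum over the positive pair and all in-batch negatives.
    return F.cross_entropy(logits, targets)


# Example usage with random embeddings standing in for encoder outputs.
if __name__ == "__main__":
    q = torch.randn(32, 128)
    k = torch.randn(32, 128)
    print(info_nce_loss(q, k).item())
```

Framing the loss as a cross-entropy over similarity logits is what makes it cheap to compute: the softmax denominator is exactly the sum over positive and negative scores described above.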
SimCLR
SimCLR is one of the first papers to suggest using a contrastive loss for self-supervised image…

