TDS Archive

An archive of data science, data analytics, data engineering, machine learning, and artificial intelligence writing from the former Towards Data Science Medium publication.

Contrastive Learning in 3 Minutes

The exponential progress of contrastive learning in self-supervised tasks

3 min read · Jan 25, 2022


While deep learning research has long been steered towards supervised image recognition tasks, many have now turned to a much less explored territory: performing the same tasks in a self-supervised manner. One of the cornerstones that led to the dramatic advancements in this seemingly impossible task is the introduction of contrastive learning losses. This article dives into some of the recently proposed contrastive losses that have pushed the results of unsupervised learning to heights comparable to supervised learning.

InfoNCE Loss

One of the earliest contrastive learning losses was the InfoNCE loss by Oord et al. Their paper, Representation Learning with Contrastive Predictive Coding, formulates the loss as follows:
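For a set X = {x_1, …, x_N} containing one positive sample x_{t+k} and N − 1 negatives, the loss from the paper can be written as

\mathcal{L}_N = -\,\mathbb{E}_X\left[\log \frac{f_k(x_{t+k}, c_t)}{\sum_{x_j \in X} f_k(x_j, c_t)}\right]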

where the numerator is essentially the score of a positive pair, and the denominator is the sum of the scores of the positive pair and all negative pairs. Ultimately, this simple loss forces positive pairs to have a greater score (pushing the term inside the log towards 1 and hence the loss towards 0) and pushes negative pairs further apart.
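As a minimal sketch of this idea (not taken from the original post), an InfoNCE-style loss over a batch of paired embeddings reduces to a cross-entropy whose targets pick out the matching pairs; the function name info_nce and the temperature of 0.1 below are illustrative assumptions:

```python
import torch
import torch.nn.functional as F

def info_nce(queries, positives, temperature=0.1):
    """InfoNCE-style loss: row i of `positives` is the positive for row i of
    `queries`; every other row in the batch acts as a negative."""
    # Normalize so the dot product becomes a cosine-similarity score.
    queries = F.normalize(queries, dim=1)
    positives = F.normalize(positives, dim=1)

    # Pairwise similarity matrix: entry (i, j) scores query i against positive j.
    logits = queries @ positives.t() / temperature

    # Positive pairs sit on the diagonal, so the loss is ordinary
    # cross-entropy with targets 0..N-1.
    targets = torch.arange(queries.size(0), device=queries.device)
    return F.cross_entropy(logits, targets)

# Example usage with random embeddings
q = torch.randn(32, 128)
p = torch.randn(32, 128)
loss = info_nce(q, p)
```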

SimCLR

Figure 1. SimCLR Overview. Image retrieved from https://arxiv.org/abs/2002.05709.

SimCLR is the first paper to suggest using contrastive loss for self-supervised image…
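Figure 1 summarizes the recipe: each image is augmented twice, both views are encoded and projected, and a contrastive loss pulls the two views together while pushing other images apart. Below is a rough sketch of that pipeline, assuming a generic encoder and projection_head and using a simplified symmetric cross-entropy in place of the full NT-Xent formulation:

```python
import torch
import torch.nn.functional as F
from torchvision import transforms

# Two random augmentations of the same image form the positive pair (Figure 1).
augment = transforms.Compose([
    transforms.RandomResizedCrop(224),
    transforms.RandomHorizontalFlip(),
    transforms.ColorJitter(0.4, 0.4, 0.4, 0.1),
    transforms.RandomGrayscale(p=0.2),
    transforms.ToTensor(),
])

def simclr_step(images, encoder, projection_head, temperature=0.5):
    """One SimCLR-style training step on a list of PIL images."""
    # Create two independently augmented views of every image.
    v1 = torch.stack([augment(img) for img in images])
    v2 = torch.stack([augment(img) for img in images])

    # Encode and project both views, then normalize the projections.
    z1 = F.normalize(projection_head(encoder(v1)), dim=1)
    z2 = F.normalize(projection_head(encoder(v2)), dim=1)

    # Matching views form the positives (diagonal); other images are negatives.
    logits = z1 @ z2.t() / temperature
    targets = torch.arange(z1.size(0), device=z1.device)

    # Symmetrize over the two view orderings.
    return (F.cross_entropy(logits, targets) +
            F.cross_entropy(logits.t(), targets)) / 2
```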


Written by Tim Cheng

Oxford CS | Top Writer in AI | Posting on Deep Learning and Vision
