Contrastive learning explained

Vishal Rajput
Published in AIGuys
6 min read · Aug 23, 2022

We all know that getting hands on clean, labeled data is extremely rare in the real world. Researchers have been trying for years to develop methods that work with partially labeled data. There are quite a few techniques in semi-supervised learning that work decently with partially labeled data, but most of them still fall short in deep learning settings. In this blog post, we are going to discuss a strategy that doesn’t require any labels at all: contrastive learning. So, without further ado, let’s dive deep into the concept.

Contrastive learning is a technique generally used in vision tasks that lack labeled data. By contrasting samples against each other, it learns both the attributes that are shared within a data class and the attributes that set one class apart from another.

As the name suggests, samples are contrasted against each other: those belonging to the same distribution or class are pulled together in the embedding space, while those belonging to different distributions or classes are pushed apart.
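The pull-together / push-apart idea can be made concrete with a loss function. Below is a minimal sketch of one classic formulation, a margin-based pairwise contrastive loss; the function name, toy embeddings, and the margin value are illustrative assumptions, not something from the article itself.

```python
import numpy as np

def contrastive_loss(z1, z2, same_class, margin=1.0):
    """Margin-based pairwise contrastive loss (illustrative sketch).

    z1, z2: embedding vectors for a pair of samples.
    same_class: True if the pair belongs to the same class/distribution.
    Similar pairs are penalized by their squared distance (pulled together);
    dissimilar pairs are penalized only while they sit inside the margin
    (pushed apart until they are at least `margin` away).
    """
    d = np.linalg.norm(z1 - z2)
    if same_class:
        return d ** 2                        # pull similar pairs together
    return max(0.0, margin - d) ** 2         # push dissimilar pairs apart

# Toy 2-D embeddings: a and b are close, c is far away.
a = np.array([0.0, 0.0])
b = np.array([0.1, 0.0])
c = np.array([2.0, 0.0])

print(contrastive_loss(a, b, same_class=True))    # 0.01 — small, pair already close
print(contrastive_loss(a, c, same_class=False))   # 0.0 — pair already beyond the margin
print(contrastive_loss(a, c, same_class=True))    # 4.0 — large, similar pair far apart
```

Minimizing this loss over many pairs shapes the embedding space exactly as described above: same-class pairs collapse toward each other, while different-class pairs are driven out past the margin.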

80% of the time spent in a supervised learning ML…
