A Primer on Semi-Supervised Learning — Part 2
Semi-Supervised Learning (SSL) is a promising approach to the scarcity of labeled data in Machine Learning. SSL learns a task from a small labeled dataset together with a larger pool of unlabeled data, with the goal of outperforming a supervised model trained on the labeled data alone. This is Part 2 of the article series on Semi-Supervised Learning and covers a few early SSL techniques in detail. Part 1 gives an introduction to Semi-Supervised Learning and can be found here.
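The combined objective described above can be sketched in a few lines. This is a minimal, hypothetical example (not code from any paper): the total loss is a supervised cross-entropy term on the labeled batch plus a weighted unsupervised term on the unlabeled batch — here a consistency penalty between predictions on two perturbed views of the same inputs, a theme the techniques below build on. The function names and the weight `w` are illustrative choices.

```python
import numpy as np

def cross_entropy(probs, labels):
    # Supervised term: mean negative log-likelihood of the true labels.
    # probs has shape (batch, classes); labels holds integer class indices.
    return -np.mean(np.log(probs[np.arange(len(labels)), labels] + 1e-12))

def consistency(probs_a, probs_b):
    # Unsupervised term: mean squared difference between the model's
    # predictions for two perturbed views of the same unlabeled inputs.
    return np.mean((probs_a - probs_b) ** 2)

def ssl_loss(labeled_probs, labels, unlabeled_probs_a, unlabeled_probs_b, w=1.0):
    # Total SSL objective: supervised loss + w * unsupervised loss.
    # w (often ramped up during training) trades off the two terms.
    return cross_entropy(labeled_probs, labels) + w * consistency(
        unlabeled_probs_a, unlabeled_probs_b
    )
```

If the predictions on the two views agree exactly, the unsupervised term vanishes and the objective reduces to the ordinary supervised loss.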
Outline of Part 2
- Consistency Regularization, Entropy Minimization, and Pseudo Labeling
- Examples of realistic perturbations in Vision and Language
- Approaches for Semi-Supervised Learning
— Π model
— Temporal Ensembling
- References
Part 3 will cover more recent SSL techniques such as Mean Teacher, Unsupervised Data Augmentation, and Noisy Student, and will be published soon.