A Few Words on Representation Learning

How can we learn good representations without relying on human-annotated data?

Encora
Encora Technology Practices
2 min read · May 5, 2021


Image by Ahmed Gad from Pixabay

The following brief abstract highlights this featured article on Representation Learning, written by Thalles Silva and published in Towards Data Science.

In this article the author discusses the importance of learning good representations from images without labels to guide the learning process. These representations are particularly valuable because they can be reused to solve a large number of related problems. This process is called transfer learning, and it is widely used in both academia and industry. Transfer learning is the idea of taking the knowledge a deep neural network has gained by solving one task (image classification, for instance) and applying it to related tasks such as image segmentation or pose estimation. The important thing to note is that many important tasks, such as pose estimation, do not have much labeled data available, which limits the solutions we can find with deep neural networks because this class of machine learning models is very data-hungry. Thus, by reusing the knowledge acquired from other tasks, we can alleviate the problem of scarce annotated data and still reach reasonable solutions, as sketched below.
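As a hypothetical illustration (not taken from the original article), the sketch below shows one common transfer-learning pattern in PyTorch: load an ImageNet-pretrained backbone, freeze its weights to preserve the learned representations, and replace the classification head so that only a small labeled dataset is needed to train the new task-specific layer. The choice of resnet18 and the value of num_classes are assumptions made for the example.

# Hypothetical transfer-learning sketch: reuse a pretrained backbone,
# train only a new head on a small labeled dataset.
import torch
import torch.nn as nn
from torchvision import models

num_classes = 10  # assumed label count for the new, small labeled task

# Load a backbone pretrained on image classification (ImageNet).
backbone = models.resnet18(weights="IMAGENET1K_V1")

# Freeze the pretrained weights so the learned representations are kept.
for param in backbone.parameters():
    param.requires_grad = False

# Replace the final classification layer with one sized for the new task.
backbone.fc = nn.Linear(backbone.fc.in_features, num_classes)

# Only the new head's parameters are updated during training.
optimizer = torch.optim.Adam(backbone.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# A training loop over the small labeled dataset would go here, e.g.:
#   logits = backbone(images); loss = criterion(logits, labels)

Because only the small replacement head is trained, this pattern can reach reasonable accuracy with far fewer labels than training the whole network from scratch.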

If Representation Learning is relevant to your own work, you can read Thalles’ full article here. Enjoy!


Encora
Encora Technology Practices

Encora helps define your strategic innovation roadmap, build capabilities to accelerate, fast-track development, and maximize market adoption.