How does Transfer Learning work?

Deep Blade · Sep 27, 2021

In this tutorial, you will learn:

  • What is transfer learning?
  • How does transfer learning work?
  • Advantages of transfer learning
  • Pre-trained models
  • Examples of transfer learning applications

What is Transfer Learning?

The simple idea of transfer learning is this: after a neural network has learned one task, we apply that knowledge to another related task. It is a powerful idea in Deep Learning. As you know, Computer Vision and Natural Language Processing tasks require a lot of computation and time, so we can simplify those tasks using Transfer Learning.
For example, after we train a model on images to classify cars, we can use that model to recognize other vehicles such as trucks.

How Transfer Learning is different

In traditional Deep Learning, models are built for specific tasks. For example, if we work on two problems such as a car classifier and a truck classifier, we need to implement a separate model for each one. But with Transfer Learning, after we train the car classifier, that knowledge is reused to build the truck classifier.

How does Transfer Learning work?

In computer vision tasks, for instance, neural networks typically learn to detect edges in the earlier layers, shapes in the middle layers, and task-specific features in the later layers. In transfer learning, we keep the early and middle layers and drop only the last layers. Then we can add new layers to our CNN model corresponding to the new task.
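
As a concrete illustration, here is a minimal Keras sketch of that idea, assuming a VGG16 base pre-trained on ImageNet; the input size, head layers, and 10-class output are hypothetical choices for the new task, not fixed requirements:

```python
# A minimal transfer-learning sketch in Keras: keep VGG16's early and middle
# layers (frozen) and replace the last layers with a new head for our task.
from tensorflow.keras.applications import VGG16
from tensorflow.keras import layers, models

# Load VGG16 pre-trained on ImageNet, dropping its original final layers
base = VGG16(weights="imagenet", include_top=False, input_shape=(224, 224, 3))
base.trainable = False  # freeze the reused layers so their features are kept

# Add new layers corresponding to the new task
model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(256, activation="relu"),
    layers.Dense(10, activation="softmax"),  # hypothetical: 10 new classes
])
model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])
```

Only the small new head is trained on our data, which is exactly why the approach needs less data and time, as discussed below.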

Advantages of Transfer Learning

Transfer Learning has several advantages.

1. Less data needed

As we already know, deep learning in particular needs a lot of data to train neural networks. But if we use Transfer Learning, we start from a pre-trained model, so we can build our model with a small amount of data without training the whole model.

2. Less time to train

Some Deep Learning models take days or even weeks to train from scratch. But in Transfer Learning, we already have a pre-trained model, so training time is reduced.

3. Can learn from simulations

This is becoming the standard with self-driving vehicles, since letting a completely untrained model learn to drive with a real car presents obvious safety risks. Transfer learning allows a model to first learn to drive in a virtual environment before ever handling a real vehicle.

Pre-trained Models

As you now know, Transfer Learning relies on pre-trained models. Fortunately, there are a lot of pre-trained models available to use.

Popular pre-trained models used in Computer Vision

VGG16 (Very Deep Convolutional Networks for Large-Scale Image Recognition)

VGG19 (Very Deep Convolutional Networks for Large-Scale Image Recognition)

ResNet50 (Deep Residual Learning for Image Recognition)

Xception (Deep Learning with Depthwise Separable Convolutions)
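
As a quick sketch, all four of these ship with tf.keras.applications and download their ImageNet weights on first use:

```python
# Loading the pre-trained computer-vision models listed above via Keras.
from tensorflow.keras.applications import VGG16, VGG19, ResNet50, Xception

vgg16 = VGG16(weights="imagenet")        # 16-layer VGG
vgg19 = VGG19(weights="imagenet")        # 19-layer VGG
resnet50 = ResNet50(weights="imagenet")  # 50-layer residual network
xception = Xception(weights="imagenet")  # depthwise separable convolutions
```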

Popular pre-trained models used in Natural Language Processing

Word2vec

GloVe (Global Vectors for Word Representation)

fastText
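
As an illustrative sketch, pre-trained GloVe vectors (along with word2vec and fastText models) can be loaded through gensim's downloader; the dataset name below is one GloVe entry from the gensim-data catalog:

```python
# A sketch of loading pre-trained word vectors with gensim's downloader API.
import gensim.downloader as api

glove = api.load("glove-wiki-gigaword-100")  # 100-dimensional GloVe vectors
print(glove.most_similar("car", topn=3))     # nearest words by cosine similarity
```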

Transfer learning Applications Examples

Medical Imaging

A model that is trained on CT scans can also be used for MRI scans.

Spam filtering

A model that is trained to categorize emails can be reused to classify emails as spam or not.

Vehicle Detection

A model that is trained to detect cars can be used to detect other vehicles.

Text Classification

A pre-trained word embedding model can be used to solve text classification problems.
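
For example, here is a hedged sketch of plugging pre-trained word vectors (such as the GloVe vectors loaded above) into a frozen Keras Embedding layer for a spam/not-spam classifier; the vocabulary size, dimensions, and head are hypothetical placeholders, and embedding_matrix would be filled from the pre-trained vectors in practice:

```python
# Text classification with a frozen, pre-trained embedding layer (sketch).
import numpy as np
from tensorflow.keras import layers, models, initializers

vocab_size, embedding_dim = 10000, 100
embedding_matrix = np.zeros((vocab_size, embedding_dim))  # placeholder weights

model = models.Sequential([
    layers.Embedding(vocab_size, embedding_dim,
                     embeddings_initializer=initializers.Constant(embedding_matrix),
                     trainable=False),  # keep the pre-trained vectors fixed
    layers.GlobalAveragePooling1D(),
    layers.Dense(1, activation="sigmoid"),  # binary output: spam or not
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
```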

Summary

  • Transfer learning is a powerful idea in deep learning: a model pre-trained on one task is reused for another related task.
  • With transfer learning, we need less data and less time to train our model.
  • Accordingly, transfer learning has become very important in computer vision and natural language processing.
  • Currently, many applications in different fields have been created using the idea of transfer learning.

Learn more about Machine Learning
