Impact of ImageNet Model Selection on Domain Adaptation

Published in SyncedReview · Mar 23, 2020 · 4 min read

Content provided by Youshan Zhang, the first author of the paper Impact of ImageNet Model Selection on Domain Adaptation.

Training and updating machine learning models depends on data annotation, yet in real-world settings we often lack the labeled data needed for training. It is therefore often necessary to transfer knowledge from an existing labeled domain to a new, unlabeled domain. However, due to data bias or domain shift, machine learning models do not generalize well from an existing domain to a novel unlabeled domain.

Domain adaptation has been a promising method to mitigate the domain shift problem. Recently, deep neural networks have achieved great success on standard image recognition datasets. However, little work addresses how features from different deep neural networks affect the domain adaptation problem. Existing methods often extract deep features from a single ImageNet model without exploring other networks. In this paper, we investigate how different ImageNet models affect transfer accuracy on domain adaptation problems. We extract features from sixteen distinct pre-trained ImageNet models and examine the performance of twelve benchmark methods when using these features. Extensive experimental results show that a higher-accuracy ImageNet model produces better features and leads to higher accuracy on domain adaptation problems.

What’s New: We are the first to examine how different ImageNet models affect unsupervised domain adaptation accuracy. Unlike most existing work, which trains adaptation methods on features from a single pre-trained model, we extract features from sixteen different ImageNet models and test them with twelve domain adaptation methods. In addition, we search the architecture of each neural network to find the best layer for feature extraction.
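
As a rough sketch of what this feature-extraction step can look like (not the authors’ exact pipeline), the snippet below loads one pre-trained torchvision model and collects activations from the layer just before the final fully connected layer; the choice of ResNet-50 and the image-folder path are placeholder assumptions for illustration.

```python
import torch
import torchvision.models as models
import torchvision.transforms as T
from torchvision.datasets import ImageFolder
from torch.utils.data import DataLoader

# Load a pre-trained ImageNet model (ResNet-50 here as one example;
# the paper compares sixteen different architectures).
model = models.resnet50(pretrained=True)
model.eval()

# Drop the final fully connected layer so the forward pass returns
# the penultimate-layer activations (2048-d for ResNet-50).
feature_extractor = torch.nn.Sequential(*list(model.children())[:-1])

preprocess = T.Compose([
    T.Resize(256),
    T.CenterCrop(224),
    T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

# "amazon_images/" is a placeholder for one domain of a benchmark dataset.
dataset = ImageFolder("amazon_images/", transform=preprocess)
loader = DataLoader(dataset, batch_size=32, shuffle=False)

features, labels = [], []
with torch.no_grad():
    for images, targets in loader:
        out = feature_extractor(images)   # shape: (B, 2048, 1, 1)
        features.append(out.flatten(1))   # shape: (B, 2048)
        labels.append(targets)

features = torch.cat(features)
labels = torch.cat(labels)
```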

How it works: We extract features from sixteen distinct pre-trained ImageNet models and evaluate them using twelve domain adaptation methods across three benchmark datasets. We then compute the correlation and R-squared statistics between domain adaptation performance and ImageNet classification performance.
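
The correlation analysis itself can be reproduced with standard tools. Below is a minimal sketch using SciPy; the accuracy numbers are made-up placeholders rather than results from the paper, and R-squared is taken from a simple linear fit of adaptation accuracy against ImageNet top-1 accuracy.

```python
import numpy as np
from scipy import stats

# Placeholder numbers for illustration only -- not values from the paper.
# ImageNet top-1 accuracy (%) for several pre-trained models ...
imagenet_top1 = np.array([69.8, 76.1, 77.4, 79.3, 80.1])
# ... and the mean domain adaptation accuracy (%) obtained with each model's features.
adaptation_acc = np.array([81.2, 85.0, 86.1, 87.5, 88.3])

# Pearson correlation between the two accuracy series.
r, p_value = stats.pearsonr(imagenet_top1, adaptation_acc)

# R-squared of a linear regression of adaptation accuracy on ImageNet accuracy.
fit = stats.linregress(imagenet_top1, adaptation_acc)
r_squared = fit.rvalue ** 2

print(f"Pearson r = {r:.3f} (p = {p_value:.3g}), R^2 = {r_squared:.3f}")
```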

Key Insights:

  1. Features from a higher-performing ImageNet-trained model are more valuable than those from a lower-performing model for unsupervised domain adaptation. Domain adaptation methods should therefore take advantage of features from high-accuracy ImageNet models rather than the frequently used ResNet-50 features.
  2. The layer prior to the last fully connected layer is the best layer for feature extraction in unsupervised domain adaptation.


The paper Impact of ImageNet Model Selection on Domain Adaptation is on arXiv.

Meet the authors: Youshan Zhang and Brian D. Davison from Lehigh University.

Lehigh University is a private research university in Bethlehem, Pennsylvania. It was established in 1865 by businessman Asa Packer. Lehigh University has four colleges: the P.C. Rossin College of Engineering and Applied Science, the College of Arts and Sciences, the College of Business and Economics, and the College of Education. It is organized into three contiguous campuses: the Asa Packer Campus, the Mountaintop Campus, and the Goodman Campus.

Share Your Research With Synced

Share My Research is Synced’s new column that welcomes scholars to share their own research breakthroughs with over 1.5M global AI enthusiasts. Beyond technological advances, Share My Research also calls for interesting stories behind the research and exciting research ideas. Share your research with us by clicking here.

To highlight the contributions of women in the AI industry, Synced introduces the Women in AI special project this month and invites female researchers from the field to share their recent research works and the stories behind the idea. Join our conversation by clicking here.


We know you don’t want to miss any story. Subscribe to our popular Synced Global AI Weekly to get weekly AI updates.

Need a comprehensive review of the past, present and future of modern AI research development? Trends of AI Technology Development Report is out!

2018 Fortune Global 500 Public Company AI Adaptivity Report is out!
Purchase a Kindle-formatted report on Amazon.
Apply for the Insight Partner Program to get a complimentary full PDF report.
