Layer Removal for Transfer Learning with Deep Convolutional Neural Networks

Weiming Zhi, Zhenghao Chen, Henry Wing Fung Yeung, Zhicheng Lu, Seid Miad Zandavi, Yuk Ying Chung

Research output: Book chapter / Published conference paper › Conference paper › peer-review

2 Citations (Scopus)

Abstract

It is usually difficult to find datasets of sufficient size to train Deep Convolutional Neural Networks (DCNNs) from scratch. In practice, a neural network is often pre-trained on a very large source dataset and then adapted to a target dataset. This approach is a form of transfer learning, and it allows very deep networks to achieve outstanding performance even when only a small target dataset is available. The bottom layers of the pre-trained network are thought to contain general information applicable to different datasets and tasks, while the upper layers contain abstract information relevant to a specific dataset and task. While studies have been conducted on fine-tuning these layers, their removal has not yet been considered. This paper explores the effect of removing the upper convolutional layers of a pre-trained network. We empirically investigated whether removing upper layers of a deep pre-trained network can improve performance for transfer learning. We found that removing upper pre-trained layers gives a significant boost in performance, but the ideal number of layers to remove depends on the dataset. We therefore suggest removing pre-trained convolutional layers when applying transfer learning to off-the-shelf pre-trained DCNNs; the ideal number of layers to remove depends on the dataset and remains a parameter to be tuned.
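
The procedure the abstract describes is straightforward to sketch in code. The snippet below is a minimal illustration of the idea rather than the authors' implementation: it assumes PyTorch with a torchvision VGG16 pre-trained on ImageNet, and the function name `truncated_vgg16` and its `layers_to_remove` argument are hypothetical placeholders for the tunable number of removed layers.

```python
# Minimal sketch (not the authors' code): truncate a pre-trained VGG16
# before transfer learning. Assumes PyTorch + torchvision >= 0.13.
import torch.nn as nn
from torchvision import models


def truncated_vgg16(num_classes, layers_to_remove=2):
    """Drop the top `layers_to_remove` convolutional layers of a
    pre-trained VGG16 and attach a fresh classifier head for the
    target dataset. `layers_to_remove` is the quantity the paper
    suggests tuning per dataset."""
    base = models.vgg16(weights=models.VGG16_Weights.IMAGENET1K_V1)

    # Positions of the Conv2d modules inside the feature extractor.
    conv_idx = [i for i, m in enumerate(base.features) if isinstance(m, nn.Conv2d)]
    assert 1 <= layers_to_remove < len(conv_idx), "invalid number of layers to remove"

    # Keep everything below the last `layers_to_remove` conv layers
    # (their ReLU/pooling companions are discarded with them).
    cutoff = conv_idx[-layers_to_remove]
    features = nn.Sequential(*list(base.features.children())[:cutoff])

    # Channel count at the new top depends on where we cut.
    out_channels = [m for m in features if isinstance(m, nn.Conv2d)][-1].out_channels

    # New head: global average pooling keeps the input size flexible.
    return nn.Sequential(
        features,
        nn.AdaptiveAvgPool2d(1),
        nn.Flatten(),
        nn.Linear(out_channels, num_classes),
    )


# Example: remove two conv layers, then fine-tune on a small target dataset.
model = truncated_vgg16(num_classes=10, layers_to_remove=2)
```

In practice the truncated network would then be fine-tuned on the target dataset (end to end, or with the retained layers frozen), with `layers_to_remove` selected by validation as the abstract recommends.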

Original language: English
Title of host publication: Neural Information Processing - 24th International Conference, ICONIP 2017, Proceedings
Editors: Dongbin Zhao, El-Sayed M. El-Alfy, Derong Liu, Shengli Xie, Yuanqing Li
Publisher: Springer-Verlag Italia Srl
Pages: 460-469
Number of pages: 10
ISBN (Print): 9783319700953
DOIs
Publication status: Published - 2017
Event: 24th International Conference on Neural Information Processing, ICONIP 2017 - Guangzhou, China
Duration: 14 Nov 2017 to 18 Nov 2017

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 10635 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 24th International Conference on Neural Information Processing, ICONIP 2017
Country/Territory: China
City: Guangzhou
Period: 14/11/17 to 18/11/17
