Layover Intermediate Layer for Multi-Label Classification in Efficient Transfer Learning

Published: 20 Oct 2022, Last Modified: 05 May 2023 · HITY Workshop, NeurIPS 2022
Keywords: Multi-label Classification, Efficient Transfer Learning, Representation Learning
TL;DR: This paper achieves efficient transfer learning for multi-label classification by combining intermediate representations and features extracted from a pre-trained network.
Abstract: Transfer Learning (TL) is a promising technique for improving performance on a target task by transferring the knowledge of models trained on relevant source datasets. With the advent of large-scale pre-trained deep models, various methods of exploiting them have come into the limelight. However, for multi-label classification tasks, TL approaches suffer from performance degradation when an image contains multiple objects with significant size differences. Because such hard instances contain barely perceptible objects, most pre-trained models lose the corresponding information during downsampling. For these hard instances, this paper proposes a simple but effective classifier that makes multiple predictions using the hidden representations of a fixed backbone. To this end, we mix the pre-logit with an intermediate representation using a learnable scale. We show that our method is as effective as fine-tuning while adding only a few parameters, and that it is particularly advantageous for hard instances.
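To make the described mixing concrete, below is a minimal PyTorch sketch of one way a classifier could combine a frozen backbone's pre-logit with an intermediate feature map via a learnable scale. The class name `LayoverClassifier`, the projection layer, and the parameter `alpha` are illustrative assumptions, not the authors' actual implementation.

```python
import torch
import torch.nn as nn


class LayoverClassifier(nn.Module):
    """Hypothetical head that mixes a pre-logit vector with an intermediate
    feature map from a frozen backbone, using a single learnable scale."""

    def __init__(self, inter_dim: int, prelogit_dim: int, num_labels: int):
        super().__init__()
        # Project pooled intermediate features into the pre-logit dimension.
        self.proj = nn.Linear(inter_dim, prelogit_dim)
        # Learnable scale controlling how much intermediate information is mixed in.
        self.alpha = nn.Parameter(torch.zeros(1))
        # Multi-label classification head (one logit per label).
        self.head = nn.Linear(prelogit_dim, num_labels)

    def forward(self, inter_feat: torch.Tensor, prelogit: torch.Tensor) -> torch.Tensor:
        # inter_feat: (B, C, H, W) intermediate feature map from the frozen backbone.
        # prelogit:   (B, D) pre-logit vector from the frozen backbone.
        pooled = inter_feat.mean(dim=(2, 3))              # global average pooling -> (B, C)
        mixed = prelogit + self.alpha * self.proj(pooled)  # scaled mixing of representations
        return self.head(mixed)                            # logits for BCEWithLogitsLoss
```

In such a setup, only the head's parameters (projection, scale, and classifier) would be trained while the backbone stays fixed, which is consistent with the paper's claim of matching fine-tuning with few additional parameters.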
Supplementary Material: zip
