Improving the Transferability of Supervised Pretraining with an MLP Projector

29 Sept 2021 (modified: 13 Feb 2023), ICLR 2022 Conference Withdrawn Submission
Keywords: transferability, supervised learning, MLP projector
Abstract: The pretrain-finetune paradigm is a classical pipeline in visual learning. Recent unsupervised pretraining methods have shown transfer performance superior to that of their supervised counterparts. Although a few works have attempted to explore the underlying mechanisms, the reasons behind the transferability gap have not been fully understood. This paper reveals that the multilayer perceptron (MLP) projector is a key factor behind the better transferability of unsupervised pretraining. Based on this observation, we attempt to close the transferability gap between supervised and unsupervised pretraining by adding an MLP projector before the classifier in supervised pretraining. Our analysis indicates that the MLP projector helps retain the intra-class variation of visual features, decreases the feature distribution distance between the pretraining and evaluation datasets, and reduces feature redundancy, enabling effective adaptation to new tasks. Extensive experiments demonstrate that the added MLP projector significantly boosts the transferability of supervised pretraining, \emph{e.g.,} \textbf{+7.2\%} top-1 accuracy on the unseen-class generalization task and \textbf{+5.7\%} top-1 accuracy on 12-domain classification tasks, making supervised pretraining even better than unsupervised pretraining.
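
For concreteness, the following is a minimal sketch of the described setup: an MLP projector inserted between the backbone and the linear classifier during supervised pretraining. The projector shape (Linear-BN-ReLU-Linear), its dimensions, and the ResNet-50 backbone are assumptions chosen to mirror common unsupervised-pretraining designs, not the paper's exact configuration.

```python
import torch
import torch.nn as nn
import torchvision.models as models

class SupervisedWithProjector(nn.Module):
    """Supervised pretraining with an MLP projector between the backbone
    and the classifier (a sketch; projector width/depth are assumptions)."""

    def __init__(self, num_classes: int, proj_dim: int = 2048, out_dim: int = 256):
        super().__init__()
        backbone = models.resnet50(weights=None)
        feat_dim = backbone.fc.in_features   # 2048 for ResNet-50
        backbone.fc = nn.Identity()          # expose backbone features directly
        self.backbone = backbone
        # MLP projector, similar in spirit to those used in unsupervised
        # pretraining (e.g., Linear -> BatchNorm -> ReLU -> Linear)
        self.projector = nn.Sequential(
            nn.Linear(feat_dim, proj_dim),
            nn.BatchNorm1d(proj_dim),
            nn.ReLU(inplace=True),
            nn.Linear(proj_dim, out_dim),
        )
        self.classifier = nn.Linear(out_dim, num_classes)

    def forward(self, x):
        h = self.backbone(x)      # backbone features (which layer is transferred
                                  # downstream is an evaluation choice, not shown here)
        z = self.projector(h)
        return self.classifier(z) # logits for the supervised cross-entropy loss

# Usage: pretrain with standard cross-entropy, then transfer the backbone.
model = SupervisedWithProjector(num_classes=1000)
logits = model(torch.randn(2, 3, 224, 224))
assert logits.shape == (2, 1000)
```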