Generalizable Transferability Estimation of Foundation Vision Models via Implicit Learning

13 Sept 2024 (modified: 14 Nov 2024) · ICLR 2025 Conference Withdrawn Submission · CC BY 4.0
Keywords: Transferability Estimation, Transfer Learning
Abstract: Transferability estimation aims to identify the most suitable model from a collection of pre-trained models for a specific downstream task, and it plays a crucial role in the success of the pre-training and fine-tuning paradigm. However, the recent proliferation of pre-trained models with diverse architectures and training strategies poses significant challenges for transferability estimation: discrepancies in intrinsic model characteristics make it difficult for existing methods to accurately simulate the evolution of the embedding space within feasible computational limits. To address these challenges, we propose an Implicit Transferability Modeling (ITM) paradigm that models the intrinsic properties of pre-trained models implicitly, enabling more accurate transferability estimation. ITM employs a Divide-and-Conquer Adaptation (DCA) process to model the transfer process efficiently, reducing both learning complexity and computational cost. Additionally, we introduce a Pseudo-Clustering-based Optimization (PCO) strategy that eliminates the need for extensive fine-tuning, enabling effective estimation without intensive retraining. Our method significantly outperforms state-of-the-art approaches on ten widely used benchmarks, demonstrating its effectiveness and generalizability in enabling accurate and efficient model selection for downstream tasks.
Primary Area: transfer learning, meta learning, and lifelong learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Reciprocal Reviewing: I understand the reciprocal reviewing requirement as described on https://iclr.cc/Conferences/2025/CallForPapers. If none of the authors are registered as a reviewer, it may result in a desk rejection at the discretion of the program chairs. To request an exception, please complete this form at https://forms.gle/Huojr6VjkFxiQsUp6.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 415