ReST: Remarkably Simple Transferability Estimation

ICLR 2026 Conference Submission 15068 Authors

19 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: Transferability Estimation
Abstract: Existing transferability estimation methods for pre-trained neural networks suffer from methodological complexity, requiring extensive labeled target data to predict transfer performance. We introduce ReST, a remarkably simple yet effective approach: it requires only a small subset of unlabeled samples from the target data and analyzes the stable rank (a robust measure of a matrix's effective dimensionality) of the final-layer representations. We demonstrate that this simple metric correlates strongly with transfer learning success across diverse tasks and architectures. Through comprehensive experiments on vision transformers and CNNs across multiple downstream tasks, we show that this remarkably simple approach not only matches but often exceeds the performance of sophisticated existing methods. ReST achieves a 4.6\% improvement over state-of-the-art methods, establishing stable rank as a powerful predictor for transferability assessment and fundamentally challenging the need for complex analysis in transfer learning evaluation. The code is made anonymously available at https://anonymous.4open.science/r/random-07C2 to ensure reproducibility of our results.
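The abstract's central quantity, the stable rank, has a standard definition: srank(A) = ||A||_F^2 / ||A||_2^2, the squared Frobenius norm over the squared spectral norm. Below is a minimal sketch of computing it over a feature matrix; the NumPy implementation, feature shapes, and sample counts are illustrative assumptions, not the authors' released code (see the anonymous repository for that).

```python
import numpy as np

def stable_rank(features: np.ndarray) -> float:
    """Stable rank of a matrix A: ||A||_F^2 / ||A||_2^2.

    A robust, scale-invariant proxy for effective dimensionality:
    it is always <= rank(A) and insensitive to tiny singular values.
    """
    # Singular values, returned by NumPy in descending order.
    s = np.linalg.svd(features, compute_uv=False)
    frobenius_sq = np.sum(s ** 2)   # ||A||_F^2 = sum of squared singular values
    spectral_sq = s[0] ** 2         # ||A||_2^2 = largest squared singular value
    return float(frobenius_sq / spectral_sq)

# Hypothetical usage: `features` stands in for final-layer representations
# of a small unlabeled target subset, shape (num_samples, embed_dim).
rng = np.random.default_rng(0)
features = rng.standard_normal((256, 768))
print(f"stable rank: {stable_rank(features):.2f}")
```

In this sketch, a higher stable rank of the target features would indicate representations spread over more effective dimensions; how that score is mapped to a transferability ranking is specified in the paper itself.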
Primary Area: transfer learning, meta learning, and lifelong learning
Submission Number: 15068