Scouting for Potential LLMs: A Preliminary Assessment of Domain Adaptability for Supervised Fine-Tuning

ICLR 2026 Conference Submission 15931 Authors

19 Sept 2025 (modified: 08 Oct 2025) · CC BY 4.0
Keywords: Large Language Models, Supervised Fine-Tuning, Model Selection, Representation Analysis, Performance Prediction
TL;DR: We propose Potential Scout, a framework that predicts LLM fine-tuning potential by analyzing internal representation patterns, enabling more efficient model selection without costly trial-and-error.
Abstract: Large Language Models (LLMs) have demonstrated remarkable performance across diverse tasks, but their effectiveness in domain-specific applications depends on how well the Supervised Fine-Tuning (SFT) data aligns with the model's pre-trained knowledge. Since SFT does not always improve performance, developers must resort to costly trial-and-error to find optimal model-dataset matches. To address this problem, we introduce Potential Scout, a lightweight framework that diagnoses a model's SFT suitability without any training. Our method builds a Thinking Curve Matrix (TCM) that tracks how hidden representations evolve across transformer layers while the model processes SFT samples. From the TCM, we derive two diagnostic indicators: the Activation Growth Score, which captures how well the model distinguishes semantic differences, and the Layer Coverage Score, which measures representational stability within the model. Combining these indicators with pre-SFT benchmark scores, we design two complementary scouting modes: In-dataset Scout leverages prior SFT experience on the same dataset, while Cross-dataset Scout operates on entirely new datasets. Across 18 LLMs and 8 datasets, Potential Scout identifies top-performing candidates in minutes, substantially reducing the SFT search space and eliminating extensive exploratory experiments in model selection.
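The abstract does not give the exact construction of the TCM or the formulas for the two scores; the sketch below only illustrates the general recipe it describes: collect per-layer hidden states for SFT samples and summarize how representations evolve with depth. The model choice (gpt2), mean pooling, and the cosine-similarity proxies standing in for Activation Growth and Layer Coverage are illustrative assumptions, not the authors' definitions.

```python
# Hypothetical sketch: build a per-sample "thinking curve" from layer-wise
# hidden states and derive two toy diagnostics. Pooling, similarity measure,
# and score definitions are assumptions, not the paper's actual formulas.
import torch
from transformers import AutoModel, AutoTokenizer

model_name = "gpt2"  # any transformers model works for this sketch
tok = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name, output_hidden_states=True)
model.eval()

def thinking_curve(text: str) -> torch.Tensor:
    """Return an (L+1, H) matrix: one mean-pooled hidden state per layer."""
    inputs = tok(text, return_tensors="pt")
    with torch.no_grad():
        out = model(**inputs)
    # out.hidden_states: tuple of (1, seq_len, H) tensors, embeddings + each layer
    return torch.stack([h.mean(dim=1).squeeze(0) for h in out.hidden_states])

def layer_stability(curve: torch.Tensor) -> float:
    """Toy 'coverage' proxy: mean cosine similarity between adjacent layers."""
    sims = torch.nn.functional.cosine_similarity(curve[:-1], curve[1:], dim=-1)
    return sims.mean().item()

def activation_growth(curve_a: torch.Tensor, curve_b: torch.Tensor) -> float:
    """Toy 'growth' proxy: how much two samples' representations diverge
    between the first layer and the last (semantic separation with depth)."""
    cos = torch.nn.functional.cosine_similarity
    first = cos(curve_a[0], curve_b[0], dim=-1).item()
    last = cos(curve_a[-1], curve_b[-1], dim=-1).item()
    return first - last  # larger => deeper layers separate the samples more

c1 = thinking_curve("Translate the contract clause into plain English.")
c2 = thinking_curve("Summarize the patient's discharge notes.")
print(f"stability: {layer_stability(c1):.3f}, growth: {activation_growth(c1, c2):.3f}")
```

Because this only needs forward passes over a handful of SFT samples, such indicators can be computed in minutes per candidate model, which is consistent with the no-training, pre-selection use case the abstract describes.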
Supplementary Material: zip
Primary Area: transfer learning, meta learning, and lifelong learning
Submission Number: 15931