LADA: Enabling Adaptation of Black-Box LLMs to Dynamic Domain Changes at Test Time

ICLR 2026 Conference Submission 10399 Authors

18 Sept 2025 (modified: 08 Oct 2025), ICLR 2026 Conference Submission, CC BY 4.0
Keywords: test-time adaptation, large language model
Abstract: Test-time adaptation (TTA) for black-box large language models (LLMs) seeks to adapt models to target-domain inputs during testing, enabling them to handle distribution shifts without access to model parameters. Most existing approaches rely on adapters trained with substantial target-domain data, yet such data are often scarce or unreliable. Moreover, the resulting adapters are tightly coupled to their training distribution and readily degrade in dynamic real-world scenarios. To address this problem, we propose a novel framework that leverages a meta-trained adapter to achieve stepwise adaptation of LLMs. Specifically, the adapter is meta-trained on tasks constructed from multiple available datasets, learning transferable skills for flexible adaptation to unseen domains. At test time, it is quickly adapted with only a few target-domain examples and guides the LLM stepwise toward domain-appropriate reasoning trajectories through adaptive selection. Experiments on various benchmark datasets validate the effectiveness of the proposed approach.
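The abstract describes a two-phase procedure: meta-train a lightweight adapter on tasks drawn from multiple source datasets, then adapt it at test time with a few target-domain examples and use it to select among candidate reasoning steps. Below is a minimal Python sketch of that loop, using a Reptile-style meta-update as one assumed instantiation; the StepScorer module, sample_task generator, and synthetic embeddings are hypothetical placeholders for illustration, not the authors' implementation, which would score reasoning steps proposed by a black-box LLM.

```python
import copy
import torch
import torch.nn as nn

EMB = 64  # placeholder embedding dimension (assumption)

class StepScorer(nn.Module):
    """Hypothetical adapter: scores (input, candidate-step) embedding pairs."""
    def __init__(self, dim=EMB):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(2 * dim, 128), nn.ReLU(), nn.Linear(128, 1))

    def forward(self, x_emb, step_emb):
        return self.net(torch.cat([x_emb, step_emb], dim=-1)).squeeze(-1)

def sample_task(n=16):
    # Synthetic stand-in for a task built from one source dataset:
    # random embeddings with a random linear "domain" rule as labels.
    x, s = torch.randn(n, EMB), torch.randn(n, EMB)
    w = torch.randn(EMB)
    y = ((x + s) @ w > 0).float()  # 1 = domain-appropriate step
    return x, s, y

def inner_adapt(model, task, steps=5, lr=1e-2):
    """A few gradient steps on one task; also used at test time."""
    x, s, y = task
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    loss_fn = nn.BCEWithLogitsLoss()
    for _ in range(steps):
        opt.zero_grad()
        loss_fn(model(x, s), y).backward()
        opt.step()
    return model

# Meta-training over tasks from multiple datasets (Reptile-style update).
meta_model, meta_lr = StepScorer(), 0.1
for _ in range(200):
    fast = inner_adapt(copy.deepcopy(meta_model), sample_task())
    with torch.no_grad():  # move meta-parameters toward the adapted ones
        for p_meta, p_fast in zip(meta_model.parameters(), fast.parameters()):
            p_meta += meta_lr * (p_fast - p_meta)

# Test time: quick adaptation on a few target-domain examples, then
# stepwise selection of the highest-scoring candidate reasoning step.
adapter = inner_adapt(copy.deepcopy(meta_model), sample_task(n=4), steps=3)
x_emb = torch.randn(1, EMB)            # embedding of the current input
candidates = torch.randn(5, EMB)       # embeddings of LLM-proposed next steps
best = adapter(x_emb.expand(5, -1), candidates).argmax().item()
print("selected candidate step:", best)
```

In this sketch the inner loop doubles as the test-time routine, which is what keeps adaptation cheap: only a handful of target examples and gradient steps are needed before the adapter starts steering step selection.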
Primary Area: transfer learning, meta learning, and lifelong learning
Submission Number: 10399