From Small to Large: In-Context Learning as a New Paradigm for Domain Generalization

Guanglin Zhou, Zhongyi Han, Shaoan Xie, Shiming Chen, Biwei Huang, Liming Zhu, Xinbo Gao, Lina Yao, Salman H. Khan

Published: 2026, Last Modified: 08 Mar 2026. Int. J. Comput. Vis. 2026. License: CC BY-SA 4.0.
Abstract: Domain generalization (DG) aims to keep machine learning models robust against distribution shifts from source to unseen target domains. DG research has evolved from small-scale models tailored with specialized loss functions, to parameter-efficient fine-tuning of moderately large models, and now toward leveraging large multimodal models (LMMs) pre-trained on vast datasets. Despite their zero-shot capabilities, LMMs struggle to adapt to specialized scenarios (e.g., healthcare) without costly retraining. In this work, we propose ICL-DG, a novel DG framework that integrates in-context learning (ICL) to improve adaptability under distribution shifts. From a Bayesian inference perspective, we theoretically conceptualize demonstration selection as the process of providing effective conditional priors. To realize this, we introduce the class-conditioned contrastive invariance (CCI) principle, which reshapes the embedding space so that same-class samples from different domains cluster together while distinct classes remain separated. This enables demonstrations to be selected based on stable class-level semantics rather than domain-specific artifacts, thereby guiding LMMs under distribution shifts without parameter updates. Empirical evaluations on four benchmarks, including Camelyon17 and HAM10000, demonstrate the efficacy of ICL-DG, with improvements of 34.2% and 16.9% in 7-shot accuracy over the zero-shot baseline, respectively. These results highlight the potential of pairing ICL with invariant demonstration selection to advance LMM-based DG, particularly in high-stakes domains such as healthcare. Our code is available at: https://github.com/jameszhou-gl/ICL-DG.
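To make the two ingredients described in the abstract concrete, the sketch below illustrates (1) a class-conditioned contrastive objective in which positives are same-class samples regardless of source domain and negatives are other classes, and (2) nearest-neighbor demonstration selection in the resulting embedding space. This is a minimal illustration, not the authors' released implementation (see the repository above); the function names, batch shapes, temperature, and the cosine k-NN retrieval rule are assumptions for exposition.

```python
# Minimal sketch (assumed details, not the official ICL-DG code) of a
# class-conditioned contrastive loss and embedding-based demonstration selection.

import torch
import torch.nn.functional as F


def class_conditioned_contrastive_loss(embeddings: torch.Tensor,
                                       labels: torch.Tensor,
                                       temperature: float = 0.1) -> torch.Tensor:
    """Supervised-contrastive-style loss: positives are all same-class samples
    in the batch (from any domain); other classes act as negatives."""
    z = F.normalize(embeddings, dim=1)                      # (N, d) unit vectors
    sim = z @ z.t() / temperature                           # pairwise similarities
    n = z.size(0)
    eye = torch.eye(n, dtype=torch.bool, device=z.device)

    pos_mask = labels.unsqueeze(0).eq(labels.unsqueeze(1)) & ~eye
    logits = sim.masked_fill(eye, float("-inf"))            # exclude self-pairs
    log_prob = logits - torch.logsumexp(logits, dim=1, keepdim=True)
    pos_log_prob = log_prob.masked_fill(~pos_mask, 0.0)     # keep positive pairs only
    pos_counts = pos_mask.sum(dim=1).clamp(min=1)
    loss = -pos_log_prob.sum(dim=1) / pos_counts
    return loss[pos_mask.any(dim=1)].mean()                 # skip singleton classes


def select_demonstrations(query_emb: torch.Tensor,
                          support_embs: torch.Tensor,
                          k: int = 7) -> torch.Tensor:
    """Pick the k support samples closest to the query in the reshaped space;
    their (image, label) pairs would be placed in the LMM prompt as context."""
    q = F.normalize(query_emb, dim=0)
    s = F.normalize(support_embs, dim=1)
    scores = s @ q                                          # cosine similarity
    return scores.topk(k).indices


if __name__ == "__main__":
    torch.manual_seed(0)
    # Toy batch: 16 samples, 32-dim embeddings, 4 classes pooled across domains.
    emb = torch.randn(16, 32, requires_grad=True)
    labels = torch.randint(0, 4, (16,))
    print("CCI-style loss:", class_conditioned_contrastive_loss(emb, labels).item())
    print("Selected demos:", select_demonstrations(torch.randn(32), emb.detach()).tolist())
```

Because the loss never distinguishes domains within a class, the learned space favors class-level semantics over domain-specific artifacts, so the k-nearest demonstrations retrieved for a target-domain query tend to be semantically matched examples that can condition the LMM without any parameter updates.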