Automatic Domain Adaptation by Transformers in In-Context Learning

Published: 18 Jun 2024, Last Modified: 21 Jul 2024. ICML 2024 Workshop ICL Poster. License: CC BY 4.0
Track: short paper (up to 4 pages)
Keywords: domain adaptation
TL;DR: We introduce in-context domain adaptation
Abstract: Selecting or designing an appropriate domain adaptation algorithm for a given problem remains challenging. This paper presents a Transformer model that can provably approximate and select domain adaptation methods for a given dataset within the in-context learning framework, where a foundation model performs new tasks without updating its parameters at test time. Specifically, we prove that Transformers can (i) approximate instance-based and feature-based unsupervised domain adaptation algorithms, and (ii) automatically select the approximated algorithm suited to a given dataset. Numerical results indicate that in-context learning achieves adaptive domain adaptation that surpasses existing methods.
Submission Number: 34
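
To make the in-context setup concrete, below is a minimal, hypothetical sketch (not the authors' architecture) of how unsupervised domain adaptation can be cast as in-context learning: labelled source examples and unlabelled target inputs are packed into one prompt, a small Transformer encoder processes the sequence, and predictions are read off the target positions without any test-time parameter updates. The model class `InContextDAModel`, the regression read-out, and all dimensions are illustrative assumptions.

```python
# Hypothetical illustration (not the paper's code): an encoder-only Transformer
# that performs in-context domain adaptation in a single forward pass.
import torch
import torch.nn as nn

class InContextDAModel(nn.Module):
    def __init__(self, x_dim=2, d_model=64, nhead=4, num_layers=2):
        super().__init__()
        # Each token packs an input x, a label slot y, and a flag indicating
        # whether the label is observed (source) or hidden (target).
        self.embed = nn.Linear(x_dim + 2, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)
        self.head = nn.Linear(d_model, 1)  # per-token regression read-out

    def forward(self, x_src, y_src, x_tgt):
        # Source tokens carry (x, y, flag=1); target tokens carry (x, 0, flag=0).
        src = torch.cat([x_src, y_src.unsqueeze(-1),
                         torch.ones(*y_src.shape, 1)], dim=-1)
        tgt = torch.cat([x_tgt,
                         torch.zeros(x_tgt.shape[0], x_tgt.shape[1], 2)], dim=-1)
        tokens = torch.cat([src, tgt], dim=1)  # one in-context prompt
        h = self.encoder(self.embed(tokens))
        # Predictions are read off the target positions only.
        return self.head(h[:, x_src.shape[1]:, :]).squeeze(-1)

# Usage: a batch of 3 prompts, each with 8 labelled source points and
# 4 unlabelled target points; no gradient step is taken at test time.
model = InContextDAModel()
x_src = torch.randn(3, 8, 2)
y_src = torch.randn(3, 8)
x_tgt = torch.randn(3, 4, 2)
print(model(x_src, y_src, x_tgt).shape)  # torch.Size([3, 4])
```

Under this framing, any adaptation behaviour (e.g. implicitly reweighting source examples or aligning features) must emerge from the attention computation over the prompt, which is the mechanism the paper's approximation results concern.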