Feed-Forward Source-Free Latent Domain Adaptation via Cross-Attention

26 May 2022 (modified: 05 May 2023) · ICML 2022 Pre-training Workshop
Keywords: latent domain adaptation, source-free, cross-attention, meta-learning
TL;DR: A cross-attention-based meta-learning approach for fast source-free latent domain adaptation
Abstract: We study the highly practical but comparatively under-studied problem of latent domain adaptation, where a source model must be adapted to a target dataset that contains a mixture of unlabelled domain-relevant and domain-irrelevant examples. Motivated by data privacy requirements and the need for embedded and resource-constrained devices of all kinds to adapt to local data distributions, we further focus on the setting of feed-forward source-free domain adaptation, where adaptation should neither require access to the source dataset nor use backpropagation. Our solution is to meta-learn a network that embeds the mixed-relevance target dataset and dynamically adapts inference for target examples using cross-attention. The resulting framework leads to consistent, strong improvements.
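
The abstract describes feed-forward adaptation in which a query example cross-attends over an embedding of the unlabelled, mixed-relevance target set, with no backpropagation at adaptation time. The sketch below illustrates that general idea only; it is not the authors' released code, and the module name (TargetSetAdapter), dimensions, and architecture details are assumptions for illustration.

```python
# Illustrative sketch: feed-forward adaptation of query examples by
# cross-attending over an unlabelled, mixed-relevance target set.
import torch
import torch.nn as nn


class TargetSetAdapter(nn.Module):
    """Adapts per-example features via cross-attention over the target dataset."""

    def __init__(self, feat_dim: int = 256, num_heads: int = 4, num_classes: int = 10):
        super().__init__()
        self.backbone = nn.Sequential(nn.Linear(784, feat_dim), nn.ReLU())
        # Query = features of the example being classified;
        # keys/values = features of the unlabelled target set.
        self.cross_attn = nn.MultiheadAttention(feat_dim, num_heads, batch_first=True)
        self.classifier = nn.Linear(feat_dim, num_classes)

    def forward(self, query_x: torch.Tensor, target_set_x: torch.Tensor) -> torch.Tensor:
        q = self.backbone(query_x).unsqueeze(1)        # (B, 1, D)
        kv = self.backbone(target_set_x).unsqueeze(0)  # (1, N, D)
        kv = kv.expand(q.size(0), -1, -1)              # share the set across queries
        adapted, _ = self.cross_attn(q, kv, kv)        # attend to domain-relevant examples
        return self.classifier((q + adapted).squeeze(1))  # residual + linear head


# Usage: 32 query examples adapted against 200 unlabelled target examples.
model = TargetSetAdapter()
with torch.no_grad():  # feed-forward adaptation: no backpropagation at test time
    logits = model(torch.randn(32, 784), torch.randn(200, 784))
print(logits.shape)  # torch.Size([32, 10])
```

In the setting the abstract outlines, such a module would be meta-learned on source data so that, at deployment, adaptation reduces to a single forward pass over the local target set.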