Continual Test-Time Adaptation by Leveraging Source Prototypes and Exponential Moving Average Target Prototypes

18 Sept 2023 (modified: 11 Feb 2024) · Submitted to ICLR 2024
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: Test Time Adaptation, Unsupervised Domain Adaptation, Continual Learning
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
TL;DR: A method to tackle continual test-time adaptation by utilizing prototypes of the source and the target domains.
Abstract: Continual Test-Time Adaptation (CTA) is a challenging task that aims to adapt a source pre-trained model to continually changing target domains. In the CTA setting, the model does not know when the target domain changes and thus faces drastic shifts in the distribution of streaming inputs at test time. The key challenge is to keep adapting the model to the continually changing target domains in an online manner. To track the changing target distributions, we propose to maintain an exponential moving average (EMA) target prototype for each class, updated with reliable target samples. We exploit these prototypes to cluster target features class-wise. Moreover, we align the target distributions to the source distribution by minimizing the distance between each target feature and its corresponding pre-computed source prototype. We empirically observe that our simple method achieves reasonable performance gains when applied to existing CTA methods. Furthermore, we compare the adaptation time of existing methods and our approach, demonstrating that our method attains noteworthy performance without substantial adaptation-time overhead.
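The abstract describes two ingredients: a per-class EMA target prototype updated from reliable target samples, and two distance terms that pull a target feature toward its EMA target prototype (class-wise clustering) and toward its pre-computed source prototype (source alignment). A minimal NumPy sketch of these updates follows; the function names, the momentum value, and the use of squared Euclidean distance are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def update_ema_prototype(proto, feat, momentum=0.99):
    """EMA update of one class prototype with a reliable target feature.

    `momentum` close to 1 makes the prototype change slowly as the
    target distribution drifts (value here is an assumption).
    """
    return momentum * proto + (1.0 - momentum) * feat

def prototype_losses(feat, pseudo_label, target_protos, source_protos):
    """Squared-distance terms for one target feature.

    cluster_loss pulls the feature toward its EMA target prototype;
    align_loss pulls it toward the pre-computed source prototype.
    """
    t = target_protos[pseudo_label]
    s = source_protos[pseudo_label]
    cluster_loss = float(np.sum((feat - t) ** 2))
    align_loss = float(np.sum((feat - s) ** 2))
    return cluster_loss, align_loss
```

In a full pipeline these terms would be added to the adaptation objective and minimized online over the streaming target batch, with "reliable" samples typically selected by a confidence threshold on the pseudo-label.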
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 1201