Open-World Test-Time Training: Self-Training with Contrastive Learning

20 Sept 2024 (modified: 12 Nov 2024) · ICLR 2025 Conference Withdrawn Submission · CC BY 4.0
Keywords: Open-World, Test-Time Training, Self-Training, Contrastive Learning, Transfer learning
Abstract: Traditional test-time training (TTT) methods address domain shift but typically assume a fixed class set, which limits their applicability in real-world scenarios with unbounded variety. Open-World Test-Time Training (OWTTT) tackles the challenge of generalizing deep learning models to unknown target-domain distributions, especially in the presence of strong out-of-distribution (OOD) data, where existing TTT methods often fail to maintain performance. Prior OWTTT work has focused primarily on distinguishing strong from weak OOD data. However, during the early stages of TTT, feature extraction is hampered by interference from strong OOD samples and corruptions, reducing feature contrast and causing some classes to be prematurely classified as strong OOD. To address this problem, we introduce Open World Dynamic Contrastive Learning (OWDCL), an approach that leverages contrastive learning to augment positive sample pairs. This strategy not only enhances feature contrast in the early stages but also significantly improves model robustness later in adaptation. OWDCL achieves state-of-the-art performance on standard benchmark datasets.
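The abstract describes using contrastive learning over augmented positive sample pairs to sharpen feature contrast during test-time adaptation. The paper's actual loss is not given here, so the following is only a minimal, dependency-free sketch of the standard NT-Xent (normalized temperature-scaled cross-entropy) contrastive loss that such a scheme typically builds on; the function names, the temperature value, and the pure-Python implementation are illustrative assumptions, not the authors' method.

```python
import math

def cosine(u, v):
    # Cosine similarity between two embedding vectors.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def nt_xent(z1, z2, tau=0.5):
    """NT-Xent loss over a batch of positive pairs (z1[i], z2[i]).

    z1 and z2 hold embeddings of two augmented views of the same N
    samples; each embedding's positive partner is its other view, and
    all remaining 2N - 2 embeddings in the batch act as negatives.
    """
    z = z1 + z2              # 2N embeddings
    n = len(z1)
    loss = 0.0
    for i in range(2 * n):
        pos = (i + n) % (2 * n)  # index of the positive partner view
        denom = sum(math.exp(cosine(z[i], z[j]) / tau)
                    for j in range(2 * n) if j != i)
        num = math.exp(cosine(z[i], z[pos]) / tau)
        loss += -math.log(num / denom)
    return loss / (2 * n)
```

As a sanity check, the loss is lower when the two views of each sample agree than when positive pairs are mismatched, which is the "increased contrast" effect the abstract relies on: pulling augmented views of the same test sample together while pushing other samples (including strong OOD candidates) apart.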
Supplementary Material: zip
Primary Area: transfer learning, meta learning, and lifelong learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Reciprocal Reviewing: I understand the reciprocal reviewing requirement as described on https://iclr.cc/Conferences/2025/CallForPapers. If none of the authors are registered as a reviewer, it may result in a desk rejection at the discretion of the program chairs. To request an exception, please complete this form at https://forms.gle/Huojr6VjkFxiQsUp6.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 2143