Overcoming Data Inequality across Domains with Semi-Supervised Domain Generalization

19 Sept 2023 (modified: 08 Mar 2024) · ICLR 2024 Conference Withdrawn Submission
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: Domain Generalization, Semi-Supervised Learning, Uncertainty, Data Inequality, Representation Learning
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
TL;DR: In this paper, we address a representative case of the data inequality problem across sources and populations, termed Semi-Supervised Domain Generalization.
Abstract: While extensive datasets have driven considerable advancements in machine learning, a significant disparity persists in the availability of data across sources and populations. This inequality across domains makes modeling difficult for those with limited data, which can raise profound practical and ethical concerns. In this paper, we address a representative case of the data inequality problem across domains, termed Semi-Supervised Domain Generalization (SSDG), in which only one domain is labeled while the rest are unlabeled. We propose a novel algorithm, ProUD, designed for progressive generalization across domains by leveraging domain-aware prototypes and uncertainty-adaptive mixing strategies. Experiments on three benchmark datasets demonstrate the effectiveness of ProUD, which outperforms existing baselines from both domain generalization and semi-supervised learning.
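The abstract describes ProUD only at a high level. The minimal PyTorch sketch below illustrates the two general ingredients it names, class prototypes computed per domain and a mixing weight adapted to pseudo-label confidence. Every function name and formula here is an illustrative assumption (nearest-prototype pseudo-labeling and confidence-weighted mixup), not the actual ProUD algorithm, whose details are not given in this abstract.

    import torch
    import torch.nn.functional as F

    def domain_prototypes(features, labels, num_classes):
        # Per-class prototypes (mean feature vectors) for one domain.
        # Assumes every class appears at least once in the batch.
        return torch.stack([
            features[labels == c].mean(dim=0) for c in range(num_classes)
        ])  # shape: (num_classes, feature_dim)

    def prototype_pseudo_labels(features, protos):
        # Pseudo-label unlabeled features by nearest prototype; use the
        # softmax over negative distances as a confidence proxy
        # (one plausible stand-in for the paper's uncertainty estimate).
        dists = torch.cdist(features, protos)   # (N, num_classes)
        probs = F.softmax(-dists, dim=1)
        conf, pseudo = probs.max(dim=1)
        return pseudo, conf

    def uncertainty_adaptive_mix(x_labeled, x_unlabeled, conf):
        # Mix labeled and unlabeled images, weighting the unlabeled sample
        # more heavily when its pseudo-label confidence is high.
        lam = conf.view(-1, 1, 1, 1)            # per-sample mixing weight
        return lam * x_unlabeled + (1.0 - lam) * x_labeled

In such a scheme, training would alternate between refreshing prototypes from the labeled domain and mixing in unlabeled-domain samples in proportion to their confidence, so that generalization proceeds progressively from reliable examples to uncertain ones.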
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
Supplementary Material: zip
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 1702