Alleviating Label Shift Through Self-trained Intermediate Distribution: Theory and Algorithms

19 Sept 2023 (modified: 11 Feb 2024) · Submitted to ICLR 2024
Primary Area: transfer learning, meta learning, and lifelong learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: Label Shift; Self-trained; Intermediate Distribution
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Abstract: Label shift, in which the source and target label marginal distributions differ, relaxes the identical-distribution assumption of classical learning and is a common obstacle in real-world problems with changing environments. Importance weighting is one of the most popular strategies for correcting label shift and comes with rigorous theoretical guarantees. However, the importance weights estimated by most existing methods suffer from high variance under large label shift or with few source samples. In this paper, we introduce an ideal intermediate distribution in place of the source distribution to reduce the deviation from the target label distribution. Our approach approximates this ideal intermediate distribution with a self-trained intermediate distribution constructed from the labeled source and unlabeled target samples, balancing the bias introduced by pseudo target labels against the variance of importance weighting. Furthermore, we establish sample complexity and generalization guarantees for our approach, which admits a tighter generalization bound than existing label shift methods under mild conditions. Extensive experimental results validate the effectiveness of our approach over existing state-of-the-art methods.
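The abstract only sketches the method, so the following is a minimal, hypothetical Python illustration of the two ingredients it names: confusion-matrix-based importance-weight estimation (in the style of BBSE, Lipton et al., 2018) and a pseudo-labeled source/target mixture standing in for the self-trained intermediate distribution. The function names, the confidence threshold `tau`, and the mixing coefficient `alpha` are illustrative assumptions, not the paper's actual algorithm.

```python
import numpy as np


def estimate_label_shift_weights(clf, X_src_val, y_src_val, X_tgt, n_classes):
    """BBSE-style importance weights w[k] ~ p_tgt(y=k) / p_src(y=k).

    Solves C w = mu, where C[i, j] = p_src(f(x)=i, y=j) is the source
    confusion matrix and mu is the distribution of hard predictions on
    the unlabeled target set. Assumes integer labels 0..n_classes-1 and
    a fitted classifier `clf` with a predict() method.
    """
    preds_src = clf.predict(X_src_val)
    C = np.zeros((n_classes, n_classes))
    for i, j in zip(preds_src, y_src_val):
        C[i, j] += 1.0 / len(y_src_val)
    mu = np.bincount(clf.predict(X_tgt), minlength=n_classes) / len(X_tgt)
    # Least squares is more stable than an explicit inverse when C is
    # ill-conditioned, which is the high-variance regime the paper targets.
    w, *_ = np.linalg.lstsq(C, mu, rcond=None)
    return np.clip(w, 0.0, None)  # importance weights must be non-negative


def build_intermediate_set(clf, X_src, y_src, X_tgt, alpha=0.5, tau=0.9):
    """Hypothetical intermediate distribution: mix the labeled source
    samples with confidently pseudo-labeled target samples (max predicted
    probability above `tau`), weighting the two parts by `alpha`."""
    proba = clf.predict_proba(X_tgt)
    conf = proba.max(axis=1)
    keep = conf > tau
    X_mix = np.vstack([X_src, X_tgt[keep]])
    y_mix = np.concatenate([y_src, proba[keep].argmax(axis=1)])
    s_w = np.concatenate([np.full(len(X_src), 1.0 - alpha),
                          np.full(int(keep.sum()), alpha)])
    return X_mix, y_mix, s_w
```

Under these assumptions, a downstream classifier could be retrained on (X_mix, y_mix) with per-sample weights s_w * w[y_mix], trading the bias of the pseudo labels against the variance of the estimated importance weights.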
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 1906