Semi-supervised Long-tailed Recognition using Alternate Sampling

23 Sept 2023 (modified: 11 Feb 2024) · Submitted to ICLR 2024
Primary Area: datasets and benchmarks
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: Long-tailed Recognition, Semi-supervised Learning
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Abstract: Long-tailed recognition is confronted by two intertwined challenges: sample scarcity in the tail classes and an imbalanced class distribution. The class geometry in feature space suffers mainly from data scarcity, while the imbalanced distribution biases the decision boundaries between classes. Previous work makes assumptions about the underlying geometric structure of the tail classes to address the data-scarcity challenge, and resorts to class-balanced sampling or reweighting to address the data-imbalance challenge. We advocate leveraging readily available unlabeled data in a semi-supervised setting to approach long-tailed recognition. An alternate sampling strategy is then introduced to overcome the two challenges in a single framework. The feature embedding (geometric structure) and the classifier are updated in an iterative fashion. The extra unlabeled data, regularized by a consistency loss, leads to a better geometric structure. Class-balanced sampling is used to train the classifier so that it is affected neither by the imbalanced distribution nor by the quality of the pseudo labels. We demonstrate significant accuracy improvements over other competitive methods on two datasets, improving on tail classes with little, if any, degradation on head classes.
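
To make the alternate sampling strategy described in the abstract concrete, below is a minimal illustrative sketch, not the authors' implementation. It assumes a PyTorch model split into a hypothetical `encoder` (feature embedding) and `classifier` (linear head), a consistency loss realized as KL divergence between predictions on two augmented views of the same unlabeled image, and class-balanced sampling via `WeightedRandomSampler`; all names, losses, and hyperparameters here are assumptions for illustration.

```python
# Illustrative sketch only (not the paper's code). Hypothetical encoder/classifier
# split; batches are assumed to be (inputs, labels) for labeled data and
# (weak_view, strong_view) for unlabeled data.
import torch
import torch.nn.functional as F
from torch.utils.data import WeightedRandomSampler


def class_balanced_sampler(labels, num_samples):
    """Sample every class with equal probability (labels: 1-D tensor of ints)."""
    labels = torch.as_tensor(labels)
    counts = torch.bincount(labels)
    weights = 1.0 / counts[labels].float()
    return WeightedRandomSampler(weights, num_samples, replacement=True)


def train_embedding_step(encoder, classifier, opt, labeled_batch, unlabeled_batch,
                         lambda_consistency=1.0):
    """Update the feature embedding with instance sampling plus a consistency loss."""
    x_l, y_l = labeled_batch
    u_weak, u_strong = unlabeled_batch
    sup_loss = F.cross_entropy(classifier(encoder(x_l)), y_l)
    # Consistency regularization: predictions on two augmented views of the
    # same unlabeled image should agree.
    with torch.no_grad():
        p_weak = F.softmax(classifier(encoder(u_weak)), dim=1)
    log_p_strong = F.log_softmax(classifier(encoder(u_strong)), dim=1)
    cons_loss = F.kl_div(log_p_strong, p_weak, reduction="batchmean")
    loss = sup_loss + lambda_consistency * cons_loss
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()


def train_classifier_step(encoder, classifier, opt, balanced_batch):
    """Update only the classifier on a class-balanced batch; the encoder is frozen."""
    x, y = balanced_batch
    with torch.no_grad():
        feats = encoder(x)
    loss = F.cross_entropy(classifier(feats), y)
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()
```

In this reading of the abstract, the two steps alternate each round: the embedding step draws instance-sampled labeled and unlabeled batches, while the classifier step draws batches through `class_balanced_sampler` so the decision boundary is not skewed by the head classes or by noisy pseudo labels.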
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 6869