Balancing Mixed Labels: Mixup meets Neural Collapse in Imbalanced Learning

ICLR 2026 Conference Submission 16984 Authors

19 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: imbalanced learning, mixup, neural collapse
TL;DR: We reveal issues with Mixup in imbalanced learning and propose a sampler (BMLS) and a classifier (MS) that balance mixed labels and treat them as singleton labels
Abstract: *Minority collapse*, where minor classes become indistinguishable, is a key challenge in imbalanced learning, addressed by methods such as Mixup with class-balanced sampling. In parallel, the simplex equiangular tight frame (ETF) from Neural Collapse (NC) has emerged as an effective fixed classifier for mitigating minority collapse. While NC has been studied in Mixup and in imbalanced learning independently, their combination remains unexplored, particularly regarding the balance of mixed labels. We investigate this overlooked factor and pose the question: *Is the balance of mixed labels important for alleviating minority collapse?* Our analysis reveals that (i) mixed labels should be balanced, and (ii) in this setting, interpreting mixed labels as singleton labels is beneficial. Building on this analysis, we propose a balanced mixed label sampler and a mixed-singleton classifier, which balance mixed labels and treat them as singleton labels. Through theoretical analysis, visualization, and ablation studies, we demonstrate the effectiveness of our approach. Experiments on standard benchmarks further confirm consistent performance gains, highlighting the importance of balancing mixed labels in imbalanced learning.
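To make the two ingredients the abstract names concrete, here is a minimal sketch, assuming a standard NumPy setup. The simplex-ETF construction is the usual one from the Neural Collapse literature; the sampler is a hypothetical stand-in for BMLS that draws both Mixup endpoints class-uniformly so every class pair is equally represented among mixed labels. The function names (`simplex_etf`, `balanced_mixup_batch`) and the sampler's exact policy are our assumptions, not the paper's method.

```python
import numpy as np

def simplex_etf(num_classes: int, feat_dim: int, seed: int = 0) -> np.ndarray:
    """Fixed simplex equiangular tight frame: C unit-norm columns in R^d
    (assumes d >= C) with pairwise inner products -1/(C-1)."""
    C = num_classes
    rng = np.random.default_rng(seed)
    # Random orthonormal basis to embed the canonical ETF in feature space.
    U, _ = np.linalg.qr(rng.standard_normal((feat_dim, C)))
    M = np.sqrt(C / (C - 1)) * (np.eye(C) - np.ones((C, C)) / C)
    return U @ M  # columns serve as the (frozen) classifier weights

def balanced_mixup_batch(samples_by_class, batch_size, alpha=1.0, seed=0):
    """Hypothetical balanced mixed-label sampler: choose the two endpoint
    classes uniformly (ignoring the imbalanced class priors), then mix."""
    rng = np.random.default_rng(seed)
    classes = list(samples_by_class)
    xs, mixed_labels = [], []
    for _ in range(batch_size):
        ci, cj = (classes[k] for k in rng.choice(len(classes), size=2))
        xi = samples_by_class[ci][rng.integers(len(samples_by_class[ci]))]
        xj = samples_by_class[cj][rng.integers(len(samples_by_class[cj]))]
        lam = rng.beta(alpha, alpha)  # usual Mixup coefficient
        xs.append(lam * xi + (1 - lam) * xj)
        # Keep the mixed label explicit; a mixed-singleton classifier would
        # treat each (ci, cj, lam) combination as its own target.
        mixed_labels.append((ci, cj, lam))
    return np.stack(xs), mixed_labels
```

Because both endpoints are drawn class-uniformly, every ordered class pair appears with probability 1/C², which is one natural reading of "balanced mixed labels"; under the standard (imbalance-following) sampler, pairs of majority classes would dominate instead.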
Supplementary Material: zip
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Submission Number: 16984