A Hybrid Proxy Self-Distillation Algorithm for Imbalanced Data Classification

12 Oct 2025 (modified: 01 Dec 2025) · IEEE MiTA 2026 Conference Submission · CC BY 4.0
Keywords: imbalanced data classification, contrastive learning, hybrid proxy, self-distillation, logit compensation
TL;DR: This paper proposes a two-stage Hybrid Proxy Self-Distillation (HPSD) algorithm for imbalanced data classification: the backbone is first trained with adaptive hybrid proxy contrastive learning and self-distillation, then the classifier is trained with a logit-compensated cross-entropy loss.
Abstract: Contrastive learning methods often neglect the feature-similarity information between majority and minority classes. To tackle this issue, this paper proposes a Hybrid Proxy Self-Distillation algorithm for imbalanced data classification, termed HPSD, which employs a two-stage decoupled training strategy. The first stage uses a dual-branch architecture. The first branch is an adaptive proxy generation scheme that fully utilizes the limited minority-class data to construct proxy representations. This effectively alleviates the negative impact of minority-sample scarcity during mini-batch training, thereby improving the model's generalization capability on underrepresented classes. The second branch introduces a self-distillation strategy that enriches the feature representations of minority-class samples by transferring feature-similarity knowledge from the majority classes. This enables the model to better capture discriminative patterns for minority classes and enhances its recognition capability. In the second stage, with the backbone weights fixed, the classifier is further optimized using a logit-compensated cross-entropy loss, aiming to improve the model's robustness and adaptability to class imbalance. Experiments demonstrate the effectiveness of the proposed algorithm.
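
The abstract does not give the exact form of the second-stage logit-compensated cross-entropy, so the sketch below assumes the common additive log-prior (logit adjustment) formulation, in which each logit is shifted by the log of its class prior before the softmax. The function name logit_compensated_ce, the temperature tau, and the class_counts argument are illustrative assumptions, not the authors' API.

import torch
import torch.nn.functional as F

def logit_compensated_ce(logits, targets, class_counts, tau=1.0):
    # Hypothetical sketch: compensation via additive log class priors
    # (logit adjustment); the paper's exact compensation term may differ.
    # Per-class prior estimated from the training label counts.
    priors = class_counts.float() / class_counts.sum()
    # Shift each logit by tau * log(prior): frequent classes are
    # penalized, rare classes are boosted before the softmax.
    adjusted = logits + tau * torch.log(priors + 1e-12)
    return F.cross_entropy(adjusted, targets)

# Usage: with the backbone frozen after stage one, only the classifier
# head would be optimized with this loss.
logits = torch.randn(8, 10)           # batch of 8 samples, 10 classes
targets = torch.randint(0, 10, (8,))  # ground-truth labels
counts = torch.randint(1, 500, (10,)) # imbalanced per-class counts
loss = logit_compensated_ce(logits, targets, counts)

Under this reading, the compensation cancels the class-prior bias that an imbalanced training set induces in the classifier head, which matches the stated goal of improving robustness to class imbalance.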
Submission Number: 1