Hard-Negative Prototype-Based Regularization for Few-Shot Class-Incremental Learning

TMLR Paper5039 Authors

05 Jun 2025 (modified: 10 Oct 2025) · Decision pending for TMLR · CC BY 4.0
Abstract: Few-shot class-incremental learning (FSCIL), in which abundant base training data are followed by novel classes with only a few labeled samples, poses challenges such as catastrophic forgetting and overfitting that lead to significant performance degradation across incremental sessions. As a remedy, recent work focuses on minimizing the interference between base- and incremental-class embeddings. However, previous studies have not explicitly considered the variation in discriminative difficulty across samples and classes, leaving room for improvement: we observe that hard-negative samples and classes (i.e., those difficult to discriminate from the ground-truth class) significantly affect FSCIL performance, whereas easy ones have little impact. Motivated by this observation, we propose a hard-negative prototype-based regularization approach that enhances discrimination between similar classes by imposing a margin penalty, measured in cosine similarity, between each sample and its most similar class prototypes. To select hard-negative prototypes, we explore two mining strategies: dynamic selection, which leverages the model's decision boundary, and static selection, which uses a pre-defined class-wise similarity matrix derived from external sources such as pre-trained models. We evaluate our approach on three widely used benchmarks, miniImageNet, CIFAR100, and CUB200, achieving state-of-the-art performance on each. Comprehensive analyses demonstrate that the proposed method enhances intra-class cohesion and inter-class separability of embeddings, both of which are crucial for FSCIL to accommodate novel classes. The code will be made publicly available upon publication.
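The abstract does not give the exact loss, but one plausible reading of the described regularizer is a hinge-style cosine margin against mined hard-negative prototypes. The sketch below illustrates both mining strategies under that assumption; the function name, the margin value, and the top_k parameter are illustrative choices, not taken from the paper.

```python
import torch
import torch.nn.functional as F

def hard_negative_margin_loss(embeddings, labels, prototypes,
                              margin=0.1, sim_matrix=None, top_k=1):
    """Margin penalty between each sample and its hardest negative prototypes.

    embeddings: (B, D) sample embeddings
    labels:     (B,) ground-truth class indices
    prototypes: (C, D) class prototypes
    sim_matrix: optional (C, C) pre-defined class-wise similarity matrix
                (static mining, e.g., from a pre-trained model); if None,
                hard negatives are mined dynamically from the model's own
                cosine similarities.
    """
    emb = F.normalize(embeddings, dim=-1)
    protos = F.normalize(prototypes, dim=-1)
    cos = emb @ protos.t()                      # (B, C) cosine similarities

    pos = cos.gather(1, labels.unsqueeze(1))    # similarity to the true class

    if sim_matrix is None:
        # Dynamic mining: mask out the true class, then take the top-k most
        # similar (i.e., hardest) negative prototypes per sample.
        neg_sims = cos.scatter(1, labels.unsqueeze(1), float('-inf'))
        hard_neg, _ = neg_sims.topk(top_k, dim=1)
    else:
        # Static mining: pick the classes most similar to the label class
        # according to the external class-wise similarity matrix.
        cls_sims = sim_matrix[labels].scatter(1, labels.unsqueeze(1),
                                              float('-inf'))
        _, idx = cls_sims.topk(top_k, dim=1)
        hard_neg = cos.gather(1, idx)

    # Hinge-style penalty: the true-class similarity should exceed each
    # hard-negative similarity by at least `margin`.
    return F.relu(margin + hard_neg - pos).mean()
```

In practice such a term would be added to the base classification loss; the static variant additionally requires a (C, C) similarity matrix, e.g., cosine similarities between class embeddings from a pre-trained model, consistent with the external sources the abstract mentions.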
Submission Length: Regular submission (no more than 12 pages of main content)
Assigned Action Editor: ~Han-Jia_Ye1
Submission Number: 5039