Abstract: Few-shot class-incremental learning (FSCIL) aims to incrementally learn new classes from limited samples without forgetting previously learned ones. However, the scarcity of samples in new classes often leads the model to overfit and can trigger catastrophic forgetting. To address these challenges, we propose the Prototype Optimization-based Method (POM) for FSCIL. Given the central role of prototypes in classification, POM improves classification performance by enhancing prototype representativeness. First, prototype quality is directly determined by the feature extractor: accurate prototypes depend on the discriminative, high-quality features it produces. We therefore design a hybrid loss function for the base training phase that trains the feature extractor thoroughly, strengthening the discriminative power of the prototype representations. Second, we propose a prototype optimization strategy that dynamically adjusts prototype positions by identifying highly similar pairs in the feature space, enforcing sufficient separation between them and reducing confusion between prototypes of new and old classes. Experimental results on miniImageNet, CIFAR100, and CUB200 show that POM performs strongly on key metrics, particularly accuracy and performance retention, significantly surpassing existing methods and demonstrating its effectiveness for incremental learning. Compared to TOPIC, POM improves average accuracy on miniImageNet by 20.40%.
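The abstract does not specify the separation rule, so the following is only a minimal sketch of the prototype-optimization idea it describes, assuming class-mean prototypes and a cosine-similarity criterion for "highly similar pairs". All names here (compute_prototypes, separate_prototypes) and the parameters sim_threshold and step are hypothetical illustrations, not the paper's implementation.

```python
import torch
import torch.nn.functional as F

def compute_prototypes(features: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
    """Class prototypes as the mean embedding of each class's support features."""
    classes = labels.unique()
    return torch.stack([features[labels == c].mean(dim=0) for c in classes])

def separate_prototypes(prototypes: torch.Tensor,
                        sim_threshold: float = 0.8,
                        step: float = 0.1) -> torch.Tensor:
    """Nudge apart prototype pairs whose cosine similarity exceeds a threshold.

    Hypothetical update rule: each overly similar pair is pushed apart along
    the direction connecting the two prototypes, increasing their margin.
    """
    protos = prototypes.clone()
    normed = F.normalize(protos, dim=1)
    sim = normed @ normed.t()                        # pairwise cosine similarity
    idx_i, idx_j = torch.where(torch.triu(sim, diagonal=1) > sim_threshold)
    for i, j in zip(idx_i.tolist(), idx_j.tolist()):
        direction = F.normalize(protos[i] - protos[j], dim=0)
        protos[i] = protos[i] + step * direction     # push the pair apart
        protos[j] = protos[j] - step * direction
    return protos

# Example: 5 classes with 5 shots each in a 64-d embedding space
feats = torch.randn(25, 64)
labels = torch.arange(5).repeat_interleave(5)
protos = separate_prototypes(compute_prototypes(feats, labels))
```

In this sketch the adjustment is a single symmetric step per conflicting pair; the paper's dynamic strategy may iterate or weight the update differently, which the abstract leaves unspecified.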