Abstract: The intricate and ever-changing nature of the real world places greater demands on neural networks, which must rapidly assimilate new concepts as they arise. This need has motivated a novel learning paradigm, few-shot class-incremental learning (FSCIL), which aims to continually learn novel categories from only a few instances while avoiding catastrophic forgetting of previous knowledge. However, recent FSCIL methods suffer significant performance limitations due to the low-quality latent representation spaces learned in the base session. To this end, this paper introduces a novel FSCIL method, Adapt and REfine (ARE). Specifically, ARE first strengthens the latent space by exploiting the powerful representational capability of pre-trained models (PTMs). It then adapts and refines the feature space and class prototypes to further improve FSCIL performance. Extensive experiments on benchmarks including CIFAR100, mini-ImageNet, and CUB200 validate the effectiveness of the proposed method.
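To make the pipeline the abstract describes concrete, the following is a minimal sketch of a prototype-based incremental classifier built on a frozen pre-trained model. This is not ARE's actual adaptation and refinement procedure, which is detailed later in the paper; it only illustrates the generic skeleton (PTM feature extraction, per-class prototypes, nearest-class-mean prediction). The `timm` backbone choice and the cosine-similarity classifier are assumptions for illustration.

```python
# Hypothetical sketch (NOT the paper's ARE method): a prototype-based
# incremental classifier on top of a frozen pre-trained model (PTM).
import torch
import timm

# Frozen PTM used as a feature extractor (num_classes=0 drops the head).
backbone = timm.create_model("vit_base_patch16_224", pretrained=True, num_classes=0)
backbone.eval()

prototypes = {}  # class id -> L2-normalized mean feature ("prototype")

@torch.no_grad()
def add_session(images: torch.Tensor, labels: torch.Tensor) -> None:
    """Add a (possibly few-shot) session: build a prototype per new class."""
    feats = torch.nn.functional.normalize(backbone(images), dim=-1)
    for c in labels.unique().tolist():
        proto = feats[labels == c].mean(dim=0)
        prototypes[c] = torch.nn.functional.normalize(proto, dim=-1)

@torch.no_grad()
def predict(images: torch.Tensor) -> torch.Tensor:
    """Nearest-class-mean prediction over all classes seen so far."""
    feats = torch.nn.functional.normalize(backbone(images), dim=-1)
    classes = sorted(prototypes)
    proto_mat = torch.stack([prototypes[c] for c in classes])  # (C, D)
    sims = feats @ proto_mat.T  # cosine similarity to each prototype
    return torch.tensor(classes)[sims.argmax(dim=-1)]
```

Because old-class prototypes are never overwritten by later sessions, this skeleton avoids catastrophic forgetting by construction; methods like ARE improve on it by additionally adapting the feature space and refining the prototypes themselves.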