Abstract: Few-Shot Class Incremental Learning (FSCIL) is a recently introduced Class Incremental Learning (CIL) setting that operates under more constrained assumptions: only very few samples per class are available in each incremental session, and the number of samples/classes is known ahead of time. Due to the limited data available in each incremental session, FSCIL suffers more from overfitting and catastrophic forgetting than general CIL. In this paper we study how advances in self-supervised learning can be leveraged to remedy overfitting and catastrophic forgetting, and we significantly advance the state of the art in FSCIL. We train a lightweight feature fusion module plus classifier on the concatenation of features from a supervised and a self-supervised model. The supervised model is trained on data from the base session, where a relatively large amount of data is available in FSCIL, whereas the self-supervised model is learned from an abundance of unlabeled data. We demonstrate that a classifier trained on the fusion of these features outperforms classifiers trained independently on either representation. We experiment with several existing self-supervised models and report results on three popular FSCIL benchmarks, Caltech-UCSD Birds-200-2011 (CUB200), miniImageNet, and CIFAR100, advancing the state of the art on each. Code is available at: https://github.com/TouqeerAhmad/FeSSSS
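To make the fusion idea concrete, here is a minimal sketch in PyTorch of a lightweight fusion head that concatenates features from a frozen supervised backbone and a frozen self-supervised backbone before classification. The class name, layer sizes, and feature dimensions are illustrative assumptions, not the authors' FeSSSS implementation (see the linked repository for that).

```python
import torch
import torch.nn as nn

class FusedFeatureClassifier(nn.Module):
    """Toy fusion head: concatenate features from a frozen supervised
    backbone and a frozen self-supervised backbone, then classify.
    Dimensions and layer sizes are illustrative, not the paper's."""

    def __init__(self, sup_dim=512, ssl_dim=768, hidden_dim=256, num_classes=100):
        super().__init__()
        self.fusion = nn.Sequential(
            nn.Linear(sup_dim + ssl_dim, hidden_dim),
            nn.ReLU(),
        )
        self.classifier = nn.Linear(hidden_dim, num_classes)

    def forward(self, sup_feat, ssl_feat):
        # sup_feat: [B, sup_dim] from the supervised (base-session) model
        # ssl_feat: [B, ssl_dim] from the self-supervised model
        fused = self.fusion(torch.cat([sup_feat, ssl_feat], dim=1))
        return self.classifier(fused)


# Usage with random stand-in features (backbones assumed frozen elsewhere)
model = FusedFeatureClassifier()
sup_feat = torch.randn(8, 512)
ssl_feat = torch.randn(8, 768)
logits = model(sup_feat, ssl_feat)  # shape [8, 100]
```

Only this small head would be trained on the concatenated features; both backbones stay fixed, which is what keeps the incremental-session training lightweight.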