OrCo: Towards Better Generalization via Orthogonality and Contrast for Few-Shot Class-Incremental Learning

Published: 01 Jan 2024, Last Modified: 19 May 2025. CVPR 2024. CC BY-SA 4.0.
Abstract: Few-Shot Class-Incremental Learning (FSCIL) introduces a paradigm in which the problem space expands with limited data. FSCIL methods inherently face the challenge of catastrophic forgetting as data arrives incrementally, making models susceptible to overwriting previously acquired knowledge. Moreover, given the scarcity of labeled samples available at any given time, models may be prone to overfitting and find it challenging to strike a balance between extensive pretraining and the limited incremental data. To address these challenges, we propose the OrCo framework built on two core principles: features' orthogonality in the representation space, and contrastive learning. In particular, we improve the generalization of the embedding space by employing a combination of supervised and self-supervised contrastive losses during the pretraining phase. Additionally, we introduce the OrCo loss to address challenges arising from data limitations during incremental sessions. Through feature space perturbations and orthogonality between classes, the OrCo loss maximizes margins and reserves space for the following incremental data. This, in turn, ensures the accommodation of incoming classes in the feature space without compromising previously acquired knowledge. Our experimental results showcase state-of-the-art performance across three benchmark datasets: mini-ImageNet, CIFAR100, and CUB. Code is available at: https://github.com/noorahmedds/OrCo.
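For intuition only, the sketch below shows one way a supervised contrastive term could be combined with an orthogonality penalty on class prototypes, the two ingredients named in the abstract. It is a minimal PyTorch sketch under our own assumptions: the function names (`supervised_contrastive_loss`, `orthogonality_penalty`, `combined_loss`), the exact loss forms, and the weight `lam` are illustrative and are not the authors' OrCo loss, pretraining objective, or perturbation scheme; see the linked repository for the actual implementation.

```python
import torch
import torch.nn.functional as F

def supervised_contrastive_loss(features, labels, temperature=0.1):
    """Supervised contrastive loss over L2-normalized embeddings.

    features: (N, D) embeddings, labels: (N,) integer class labels.
    Anchors with no positive in the batch are skipped.
    """
    features = F.normalize(features, dim=1)
    sim = features @ features.t() / temperature          # (N, N) similarity logits
    n = features.size(0)
    self_mask = torch.eye(n, dtype=torch.bool, device=features.device)
    sim = sim.masked_fill(self_mask, -1e9)                # exclude self-similarity
    pos_mask = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~self_mask
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    pos_count = pos_mask.sum(dim=1).clamp(min=1)
    loss = -(log_prob * pos_mask).sum(dim=1) / pos_count  # mean log-prob over positives
    return loss[pos_mask.any(dim=1)].mean()

def orthogonality_penalty(prototypes):
    """Push pairwise cosine similarities between class prototypes toward zero."""
    p = F.normalize(prototypes, dim=1)
    gram = p @ p.t()
    off_diag = gram - torch.eye(p.size(0), device=p.device)
    return (off_diag ** 2).mean()

def combined_loss(features, labels, prototypes, lam=1.0):
    """Hypothetical combined objective: contrastive term plus orthogonality term."""
    return supervised_contrastive_loss(features, labels) + lam * orthogonality_penalty(prototypes)

# Toy usage with random data.
feats = torch.randn(32, 128)
lbls = torch.randint(0, 8, (32,))
protos = torch.randn(8, 128)
print(combined_loss(feats, lbls, protos))
```

The intent of the orthogonality term in this sketch is to keep class directions mutually decorrelated, which loosely mirrors the abstract's idea of maximizing margins and reserving room in the feature space for classes that arrive in later incremental sessions.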