The Closer, The Better: Towards Better Representation Learning for Few-Shot Class-Incremental Learning

20 Sept 2023 (modified: 25 Mar 2024), ICLR 2024 Conference Withdrawn Submission
Keywords: few-shot learning, incremental learning, representation learning
TL;DR: Effective Representation Learning for Few-Shot Class Incremental Learning
Abstract: Aiming to incrementally learn new classes from only a few samples while preserving knowledge of the base (old) classes, few-shot class-incremental learning (FSCIL) faces several challenges, such as overfitting and catastrophic forgetting. To bypass these issues, many works employ a non-parametric classifier: each class is represented by the mean of its features, computed with a feature extractor that is trained on the base classes and then kept fixed. Under this formulation, representation learning is critical to the two requirements unique to FSCIL: (1) the transferability of the learned representation to new knowledge, and (2) the discriminability between all classes, old and new alike. Recent advances in representation learning, such as contrastive learning, have greatly improved transferability, which is often attributed to the spread of intra-class features. However, we observe that improving transferability alone can harm the discriminability of FSCIL models, as too much feature spread degrades the quality of the feature-mean class representation. Based on this observation and further experimental analysis, we claim that we need not only to increase the intra-class distance but also to decrease the inter-class distance. By securing feature spread and discriminability within the more confined space imposed by small inter-class distances, the learned representation strikes a good balance between transferability and discriminability. Strong performance, achieved without any weight update while learning new classes, demonstrates the effective discriminability and transferability of our representation, founded upon our seemingly counter-intuitive claim: the-Closer-the-Better (CnB).
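The non-parametric classifier described in the abstract is commonly realized as a nearest-class-mean (prototype) classifier over features from a frozen backbone. Below is a minimal sketch of that idea, assuming a PyTorch encoder that maps a batch of images to d-dimensional features; all function and variable names are illustrative and not taken from the paper's code.

```python
import torch
import torch.nn.functional as F


@torch.no_grad()
def build_prototypes(encoder, images_per_class):
    """Represent each class by the mean of its L2-normalized features.

    encoder          -- frozen feature extractor trained on the base classes
    images_per_class -- dict: class_id -> tensor of shape (n_i, C, H, W)
    Returns a dict: class_id -> prototype feature vector of shape (d,).
    """
    prototypes = {}
    for cls, images in images_per_class.items():
        feats = F.normalize(encoder(images), dim=-1)        # (n_i, d)
        prototypes[cls] = F.normalize(feats.mean(0), dim=-1)
    return prototypes


@torch.no_grad()
def classify(encoder, prototypes, images):
    """Assign each image to the class whose prototype is closest (cosine similarity)."""
    feats = F.normalize(encoder(images), dim=-1)                    # (B, d)
    classes = list(prototypes.keys())
    proto_mat = torch.stack([prototypes[c] for c in classes])       # (K, d)
    sims = feats @ proto_mat.T                                      # (B, K)
    return [classes[i] for i in sims.argmax(dim=1).tolist()]
```

In this setup, new classes are added simply by computing prototypes from their few samples with the same frozen encoder, which is why the quality of the learned representation (its transferability and discriminability) determines performance rather than any incremental weight update.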
Primary Area: transfer learning, meta learning, and lifelong learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 2114