Keywords: Computer Vision, Incremental Learning, Granular Ball, Feature Representation, Multi-granularity
TL;DR: BallIL mitigates catastrophic forgetting in exemplar-free class-incremental learning using multi-granularity granular ball representations and Synergy Drift Estimation, outperforming prior methods across six datasets.
Abstract: Catastrophic forgetting remains a critical challenge in deep learning, particularly when samples from previously encountered classes are unavailable. This challenge drives advances in Exemplar-Free Class-Incremental Learning (EFCIL). Existing incremental learning approaches, however, typically use a single, fixed granularity for class representation, such as prototypes or features. We show that class representations exhibit varying granularity both within and across tasks, and that the granularity of new classes gradually increases as tasks progress, which can bias the model toward new classes during classification. To address this, we propose Granular Ball Incremental Learning (BallIL), which uses a granular ball representation for multi-granularity class description and progressively expands the granularity of old classes to balance inter-task differences. Based on the class concepts provided by the granular ball representation, we design concept-informed representation and decision uncertainties to guide the classification loss. To address the issue of outdated class representations in new tasks caused by feature drift, we develop a Synergy Drift Estimation (SDE) module for BallIL, ensuring that past concepts remain effective in the new representation space. Extensive experiments across six datasets consistently demonstrate the superior performance of our method over current state-of-the-art methods. The code will be released.
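To make the multi-granularity idea concrete, below is a hypothetical numpy sketch of granular-ball generation in the general granular-ball computing style: a class's feature set is recursively split with 2-means until each ball (center, radius, count) is compact enough. The thresholds `max_radius` and `min_points`, and all function names, are illustrative assumptions, not the authors' actual BallIL implementation.

```python
import numpy as np

def two_means_split(X, iters=10, seed=0):
    # Split feature matrix X (n, d) into up to two clusters with a
    # short k-means (k=2) run; returns the non-empty parts.
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), 2, replace=False)]
    for _ in range(iters):
        dists = np.linalg.norm(X[:, None] - centers[None], axis=2)
        assign = dists.argmin(axis=1)
        for k in range(2):
            if np.any(assign == k):
                centers[k] = X[assign == k].mean(axis=0)
    return [X[assign == k] for k in range(2) if np.any(assign == k)]

def granular_balls(X, max_radius=1.0, min_points=2):
    # Recursively split X until each ball's radius (mean distance of
    # its points to the ball center) drops below max_radius, yielding
    # a multi-granularity set of (center, radius, count) balls.
    center = X.mean(axis=0)
    radius = float(np.linalg.norm(X - center, axis=1).mean())
    if radius <= max_radius or len(X) <= min_points:
        return [(center, radius, len(X))]
    parts = two_means_split(X)
    if len(parts) < 2:  # degenerate split; stop to avoid recursing forever
        return [(center, radius, len(X))]
    balls = []
    for part in parts:
        balls.extend(granular_balls(part, max_radius, min_points))
    return balls
```

A coarse `max_radius` yields few large balls (coarse granularity); a small one yields many fine balls, so the same class can be described at several granularities by varying the threshold.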
Supplementary Material: zip
Primary Area: transfer learning, meta learning, and lifelong learning
Submission Number: 2702