Abstract: Model-based class incremental learning (CIL) methods aim to mitigate catastrophic forgetting by retaining previously learned parameters and expanding the model architecture. However, retaining too many parameters makes the model overly complex and increases inference overhead, while compressing these parameters to reduce model size can degrade performance. To tackle these challenges, we propose a novel three-stage CIL framework called Localized and Layered Reparameterization for Incremental Learning (L3Net). The rationale behind our approach is to balance model complexity and performance by selectively expanding and optimizing critical components. Specifically, the framework introduces a Localized Dual-path Expansion structure, which allows the model to learn simultaneously from both old and new features by integrating a fusion selector after each convolutional layer. To further minimize potential conflicts between old and new features, we introduce Feature Selectors Gradient Resetting, which sparsifies the fusion selectors and reduces the influence of redundant old features. Additionally, to address the classification bias caused by class imbalance, we design Decoupled Balanced Distillation and apply Logit Adjustment to more effectively retain knowledge from the rehearsal set. Extensive experiments demonstrate that L3Net outperforms state-of-the-art methods on widely used benchmarks, including CIFAR-100 and ImageNet-100/1000.
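To make the class-imbalance correction concrete, the sketch below illustrates the standard logit adjustment idea referenced above: each class logit is shifted by a scaled log of its empirical prior so that classes seen only through the small rehearsal set are not systematically under-predicted. The variable names (`class_counts`, `tau`) and the placement of the adjustment inside a cross-entropy loss are illustrative assumptions, not the precise formulation used in L3Net.

```python
import torch
import torch.nn.functional as F

def logit_adjusted_loss(logits, targets, class_counts, tau=1.0):
    """Cross-entropy with logit adjustment (illustrative sketch).

    logits:       (batch, num_classes) raw model outputs
    targets:      (batch,) ground-truth class indices
    class_counts: (num_classes,) observed sample counts per class,
                  e.g. rehearsal-set counts for old classes plus the
                  current task's counts for new classes (an assumption here)
    tau:          temperature controlling the adjustment strength
    """
    # Empirical class priors; small epsilon avoids log(0) for unseen classes.
    priors = class_counts.float() / class_counts.sum()
    adjustment = tau * torch.log(priors + 1e-12)

    # Shift logits by the log-priors so frequent (new) classes receive
    # no unfair advantage over rare (old, rehearsal-only) classes.
    adjusted_logits = logits + adjustment.unsqueeze(0)
    return F.cross_entropy(adjusted_logits, targets)
```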