Subspace-Guided Continual Learning: Hessian-Based Stable–Plastic Decomposition for Exemplar-Free Class-Incremental Learning

18 Sept 2025 (modified: 12 Feb 2026) · ICLR 2026 Conference Desk Rejected Submission · CC BY 4.0
Keywords: Continual Learning, Exemplar-Free Class-Incremental Learning, Computer Vision, Feature Space Decomposition, Stability-Plasticity Dilemma
TL;DR: Our method leverages the feature-space Hessian to decompose features into stable and plastic subspaces, enabling selective regularization to achieve better performance.
Abstract: Exemplar-Free Class-Incremental Learning (EFCIL) presents a significant challenge in continual learning: a model must learn new classes sequentially without access to old data, making it susceptible to catastrophic forgetting. The core difficulty lies in balancing model stability (preserving old knowledge) and plasticity (acquiring new knowledge). We propose Subspace-Guided Continual Learning (SGCL), a novel method that tackles this dilemma from a geometric perspective. SGCL functionally decomposes the feature space into two orthogonal subspaces: a "stable subspace" containing feature directions critical for previous tasks, and a "plastic subspace" where new knowledge can be learned with minimal interference. We demonstrate that this decomposition can be efficiently identified by analyzing the feature-space Hessian, whose high-curvature eigendirections define the stable subspace. Building on this, SGCL introduces two synergistic components: 1) Subspace-Guided Regularization (SGR), which imposes strong, curvature-weighted penalties on feature drift within the stable subspace, and 2) Subspace-Guided Prototype Alignment (SGPA), which adaptively corrects the shift of old-class prototypes to recalibrate the classifier. Extensive experiments on standard benchmarks, including CIFAR-100, Tiny-ImageNet, and ImageNet-Subset, show that SGCL significantly outperforms existing state-of-the-art methods. Our work provides a principled and effective approach to EFCIL, offering a new perspective on mitigating forgetting by analyzing the loss landscape structure.
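The decomposition described in the abstract can be illustrated with a minimal NumPy sketch. This is an assumption-laden reconstruction, not the authors' implementation: it takes a (symmetric) feature-space Hessian as given, splits its eigendirections into stable (top-k curvature) and plastic subspaces, and computes a curvature-weighted drift penalty in the spirit of SGR. The function names `split_subspaces` and `sgr_penalty`, the choice of `k`, and the quadratic penalty form are all hypothetical.

```python
import numpy as np

def split_subspaces(hessian, k):
    # Eigendecompose the symmetric feature-space Hessian.
    # High-curvature eigendirections span the stable subspace (hypothetical split).
    eigvals, eigvecs = np.linalg.eigh(hessian)
    order = np.argsort(eigvals)[::-1]       # sort by descending curvature
    stable = eigvecs[:, order[:k]]          # top-k directions: stable subspace
    plastic = eigvecs[:, order[k:]]         # remaining directions: plastic subspace
    return eigvals[order[:k]], stable, plastic

def sgr_penalty(feat_new, feat_old, stable_basis, curvatures):
    # Curvature-weighted penalty on feature drift, applied only to the
    # component of the drift that lies in the stable subspace.
    drift = feat_new - feat_old
    proj = stable_basis.T @ drift           # coordinates in the stable subspace
    return float(np.sum(curvatures * proj ** 2))
```

With a diagonal Hessian `diag(5, 3, 0.1)` and `k=2`, drift along the low-curvature third axis incurs zero penalty (it stays in the plastic subspace), while drift along the first axis is penalized in proportion to its curvature — the selective-regularization behavior the abstract describes.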
Supplementary Material: pdf
Primary Area: transfer learning, meta learning, and lifelong learning
Submission Number: 11469