CNSP: Consistent Null-Space Projection for Principled Prompt-Based Continual Learning

Authors: Anonymous (ICLR 2026 Conference Submission17835)

Published: 19 Sept 2025 (modified: 08 Oct 2025), ICLR 2026 Conference Submission, CC BY 4.0
Keywords: Continual Learning, Visual Prompt Tuning, Gradient Projection, Anti-forgetting
Abstract: Prompt-based continual learning has recently shown strong empirical progress, yet its theoretical underpinnings remain incomplete. Prior work such as NSP$^2$ derives sufficient conditions under which visual prompt tuning preserves performance on earlier tasks via null-space projection, and achieves strong empirical results, but its reliance on simplifying assumptions about multi-head self-attention (MHSA) and LayerNorm undermines robustness and interpretability. In this paper, we revisit the problem from a matrix-level perspective and propose Consistent Null-Space Projection (CNSP). Our framework introduces: (i) rigorous per-head derivations under MHSA; (ii) a matrix-form characterization of LayerNorm; (iii) a relaxed prompt-variance constraint that is more stable in practice; and (iv) refined sufficient conditions, enforced via null-space projection, that extend naturally to classification heads and thereby ensure end-to-end preservation of task performance. Extensive experiments on multiple benchmarks demonstrate that CNSP consistently improves over NSP$^2$. Our results highlight the importance of principled matrix-level formulations for building robust and interpretable prompt-based continual learning methods.
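For readers unfamiliar with the mechanism the abstract refers to, the sketch below illustrates generic null-space gradient projection as used in this line of work: a null-space basis of the old-task feature covariance is estimated, and new-task gradients are projected onto it so that updates (approximately) leave old-task outputs unchanged. This is a minimal illustration of the general technique, not the authors' CNSP; the function names (`null_space_basis`, `project_grad`), the eigendecomposition-based basis construction, and the `eps` threshold are all assumptions for the example.

```python
import torch

def null_space_basis(feats: torch.Tensor, eps: float = 1e-5) -> torch.Tensor:
    """Orthonormal basis for the approximate null space of the old-task
    feature covariance.

    feats: (n, d) inputs seen by a layer on previous tasks.
    Returns a (d, k) matrix whose columns span directions in which the
    old-task features carry (near-)zero energy.
    """
    cov = feats.T @ feats / feats.shape[0]     # (d, d) uncentered covariance
    eigvals, eigvecs = torch.linalg.eigh(cov)  # eigenvalues in ascending order
    keep = eigvals <= eps * eigvals.max()      # near-zero spectrum = null space
    return eigvecs[:, keep]                    # (d, k), k = null-space dimension

def project_grad(grad: torch.Tensor, basis: torch.Tensor) -> torch.Tensor:
    """Restrict a weight gradient to the old-task null space.

    grad: (out_dim, in_dim) gradient of a layer whose input is d-dimensional.
    Right-multiplying by basis @ basis.T keeps only update directions with
    delta_W @ x_old ≈ 0, so old-task outputs are (approximately) preserved.
    """
    return grad @ basis @ basis.T
```

In use, one would replace each projected parameter's gradient with `project_grad(w.grad, basis)` before `optimizer.step()` when training on a new task. CNSP's refined sufficient conditions additionally cover the per-head MHSA blocks, LayerNorm, and the classification head, none of which this generic sketch models.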
Primary Area: transfer learning, meta learning, and lifelong learning
Submission Number: 17835