Stop Before You Forget: NTK-Guided Early Stopping for Continual Learning

ICLR 2026 Conference Submission16997 Authors

Published: 19 Sept 2025 (modified: 08 Oct 2025), ICLR 2026 Conference Submission. License: CC BY 4.0
Keywords: Continual Learning, Catastrophic Forgetting
TL;DR: We prevent catastrophic forgetting in few-shot continual learning by predicting gradient interference via FAR, enabling efficient and knowledge-preserving adaptation of LLMs.
Abstract: Few-shot continual learning poses a fundamental challenge: acquiring new knowledge from minimal data while preserving previously learned capabilities. Existing parameter-efficient fine-tuning methods such as LoRA, while computationally efficient, often suffer from catastrophic forgetting even with small parameter updates. Current mitigation strategies are mostly reactive, attempting to recover knowledge after interference has occurred, which is often ineffective in data-scarce scenarios. We propose a proactive prevention framework grounded in Neural Tangent Kernel (NTK) theory. Our central idea is that gradient interference can be predicted before it causes irreversible forgetting. To this end, we introduce the Forgetting-Acquisition Ratio (FAR), a metric that quantifies conflicts between new gradient updates and existing knowledge subspaces in real time. FAR enables principled early stopping just before forgetting emerges, supported by an adaptive threshold that automatically adjusts the protection strength based on task similarity. Our approach integrates seamlessly with parameter-efficient methods such as linearized LoRA, adding minimal computational overhead and no inference cost. Theoretical analysis provides formal guarantees on forgetting, and empirical studies confirm that proactive prevention fundamentally outperforms reactive strategies. This work lays the foundation for continual learning in the era of large language models, where adaptation must be both data-efficient and knowledge-preserving.
Primary Area: transfer learning, meta learning, and lifelong learning
Submission Number: 16997