Shapley Neuron Values for Continual Learning: Which Neurons Matter Most?

Published: 29 Mar 2026, Last Modified: 08 May 2026 · OpenReview Archive Direct Upload · CC BY 4.0
Abstract: Continual learning enables neural networks to learn tasks sequentially without forgetting previously acquired knowledge. However, catastrophic forgetting, where performance on earlier tasks degrades sharply when learning new ones, remains a fundamental challenge. We address this problem with Shapley Neuron Valuation (SNV), a principled framework grounded in cooperative game theory that quantifies neuron importance in continual learning. By selectively freezing important neurons while keeping others plastic, SNV enables memory-free continual learning without architectural expansion. Extensive experiments show that SNV delivers substantial gains over memory-free baselines, achieving +19.50% accuracy on CIFAR-100 and +17.20% on TinyImageNet in the Class-IL setting. In Task-IL scenarios, SNV consistently surpasses existing memory-free approaches by large margins, reaching up to 9.08% higher accuracy on CIFAR-100 than the second-best memory-free method, while remaining competitive with memory-based methods.
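The abstract does not spell out how Shapley values over neurons are computed; the paper's exact procedure may differ. As a rough illustration of the underlying game-theoretic idea, the sketch below estimates each neuron's Shapley value by Monte Carlo permutation sampling, treating neurons as players and a caller-supplied `value_fn` (e.g. validation accuracy with the remaining neurons masked; hypothetical here) as the coalition payoff. The function names and parameters are assumptions, not the authors' API.

```python
import random

def shapley_neuron_values(neurons, value_fn, num_samples=200, seed=0):
    """Monte Carlo estimate of per-neuron Shapley values.

    neurons:  iterable of neuron identifiers (the "players").
    value_fn: maps a frozenset of active neurons to a scalar payoff,
              e.g. accuracy with all other neurons masked (hypothetical).
    Returns a dict neuron -> estimated Shapley value.
    """
    values = {n: 0.0 for n in neurons}
    order = list(neurons)
    rng = random.Random(seed)
    for _ in range(num_samples):
        rng.shuffle(order)               # one random player ordering
        active = set()
        prev = value_fn(frozenset(active))
        for n in order:                  # add neurons one at a time
            active.add(n)
            cur = value_fn(frozenset(active))
            values[n] += cur - prev      # marginal contribution of n
            prev = cur
    return {n: v / num_samples for n, v in values.items()}

def top_neurons_to_freeze(values, k):
    """Pick the k highest-valued neurons, e.g. to freeze before a new task."""
    return sorted(values, key=values.get, reverse=True)[:k]
```

For an additive payoff (each neuron contributes a fixed amount independently), the Shapley value of a neuron recovers exactly that contribution, which makes the estimator easy to sanity-check before plugging in a real masked-accuracy payoff.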