Abstract: Continual learning aims to progressively acquire new knowledge while retaining previously learned information, addressing the challenge of catastrophic forgetting. This paper introduces a novel continual learning method, called SPrint, which is the first research effort to apply the principles of self-paced learning to continual learning problems. SPrint dynamically adapts the complexity of samples for both new and previous tasks in response to the model's current learning capacity. It employs a self-paced loss function to sample data from new tasks and a forgetting-occurrence criterion to sample previous tasks from replay memory. Through extensive empirical evaluation, we demonstrate that SPrint consistently outperforms state-of-the-art methods on various continual learning benchmarks. Our source code is publicly available at https://github.com/bigbases/SPrint.
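To make the two sampling ideas in the abstract concrete, the sketch below illustrates (a) classic binary self-paced weighting, which keeps "easy" samples whose loss falls below a pacing threshold, and (b) replay sampling biased toward examples that have been forgotten more often. This is a minimal illustration of the general techniques named in the abstract, not the authors' implementation; the function names, the additive smoothing constant, and the per-sample forgetting counts are assumptions for the example.

```python
import numpy as np

def self_paced_weights(losses, lam):
    """Binary self-paced weights: include a sample (weight 1.0) only if
    its current loss is below the pacing threshold lam; lam is typically
    increased over training so harder samples are admitted gradually."""
    return (np.asarray(losses, dtype=float) < lam).astype(float)

def sample_replay(forget_counts, k, rng=None):
    """Draw k replay-memory indices with probability proportional to how
    often each stored example has been forgotten (hypothetical counts);
    a small constant keeps never-forgotten examples sampleable."""
    rng = rng if rng is not None else np.random.default_rng(0)
    p = np.asarray(forget_counts, dtype=float) + 1e-8
    p /= p.sum()
    return rng.choice(len(p), size=k, replace=False, p=p)

# Example: only the two low-loss samples are admitted at lam = 0.5.
weights = self_paced_weights([0.1, 0.9, 0.3, 1.2], lam=0.5)
replay_idx = sample_replay([5, 0, 2, 1], k=2)
```

As the pacing threshold `lam` grows, the weight vector admits progressively harder samples, mirroring how self-paced methods adapt sample complexity to the model's current capacity.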