Abstract: Continual learning (CL) refers to the ability to continuously learn and accumulate new
knowledge while retaining useful information from past experiences. Although numerous
CL methods have been proposed in recent years, it is not straightforward to deploy them
directly to real-world decision-making problems due to their computational cost and lack of
uncertainty quantification. To address these issues, we propose CL-BRUNO, a probabilistic,
Neural Process-based CL model that performs scalable and tractable Bayesian updates
and prediction. Our proposed approach uses deep generative models to create a unified
probabilistic framework capable of handling different types of CL problems such as task-
and class-incremental learning, allowing users to integrate information across different CL
scenarios using a single model. Our approach prevents catastrophic forgetting
through distributional and functional regularisation without retaining any
previously seen samples, making it appealing to applications where data privacy or storage
capacity is of concern. Experiments show that CL-BRUNO outperforms existing methods
on both natural image and biomedical data sets, confirming its effectiveness in real-world
applications.
Submission Length: Long submission (more than 12 pages of main content)
Assigned Action Editor: ~Magda_Gregorova2
Submission Number: 4714