Contrastive Learning of Multivariate Gaussian Distributions of Incremental Classes for Continual Learning

Published: 01 Jan 2024 · Last Modified: 05 Nov 2025 · IWINAC 2024 · CC BY-SA 4.0
Abstract: Recent deep learning algorithms achieve remarkable performance on the tasks they are trained on, yet they suffer from "catastrophic forgetting" when faced with new tasks, motivating Continual Learning (CL) methods that update models efficiently without losing prior knowledge. Because CL models see only the dataset of the current task, they develop a strong dependency on past tasks, which complicates both the integration of new information and the maintenance of robustness against future tasks. This paper proposes a novel CL method that leverages contrastive learning to reserve latent space for representing future data, reducing the dependency on past tasks and enhancing model adaptability. By separating class-specific regions in the latent domain and re-representing each class as a set of means and variances, our method effectively preserves past knowledge while remaining robust to future tasks. Experimental results show that our method surpasses existing CL methods by a significant margin, demonstrating its efficacy in handling information across past, present, and future tasks and establishing a robust solution to the challenges of catastrophic forgetting and task dependency in CL.
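To make the idea of re-representing classes as means and variances concrete, the sketch below shows one common way such Gaussian class summaries are used in continual learning: compress each class's latent features into a (mean, covariance) pair, then sample pseudo-features from that Gaussian for replay instead of storing raw data. This is an illustrative assumption about the general technique, not the authors' actual implementation; all function names here are hypothetical.

```python
# Illustrative sketch (NOT the paper's code): storing each past class as a
# multivariate Gaussian in latent space and sampling pseudo-features for replay.
import numpy as np


def summarize_class(features: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    """Compress one class's latent features into a (mean, covariance) pair."""
    mean = features.mean(axis=0)
    cov = np.cov(features, rowvar=False)
    return mean, cov


def sample_replay(mean: np.ndarray, cov: np.ndarray, n: int, seed: int = 0) -> np.ndarray:
    """Draw pseudo-features for a past class without keeping its raw data."""
    rng = np.random.default_rng(seed)
    return rng.multivariate_normal(mean, cov, size=n)


# Example: 200 latent vectors of dimension 8 for a single class.
rng = np.random.default_rng(42)
feats = rng.normal(loc=1.0, scale=0.5, size=(200, 8))
mu, sigma = summarize_class(feats)
replay = sample_replay(mu, sigma, n=16)
print(mu.shape, sigma.shape, replay.shape)  # (8,) (8, 8) (16, 8)
```

The appeal of this representation is memory efficiency: a d-dimensional class costs O(d^2) floats for the covariance rather than storing every exemplar, and sampled pseudo-features can be mixed with current-task data when training on a new task.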