Exemplar-free Class Incremental Learning requires the learning agent to incrementally acquire new class information and maintain past knowledge without access to samples from previous tasks. Despite the strong performance achieved by subspace ensembles of a mixture of experts (MoE) with Gaussian prototypical networks, a critical gap remains: as downstream tasks arrive, the subspace representation of old classes is updated, causing prototype drift and, in turn, forgetting. To address this forgetting problem, we propose ProCEED, which dynamically realigns the representations of previous classes in the latest subspace, adjusting the drifted class prototypes and preserving their decision boundaries. Specifically, we compute the inter-subspace angular drift between the prototypes of previous incremental stages and the current one, capturing the local semantic relationship between the incremental subspaces. This angular drift is then used to map the prototypes of old tasks into the subspace of the current incremental task. Furthermore, the model inherits combined knowledge from the MoE, supporting plasticity without extra computational overhead. Consequently, ProCEED balances the stability-plasticity dilemma across incoming incremental tasks, allowing the model to learn continually. Experimental evaluations on challenging benchmark datasets demonstrate superior accuracy for ProCEED compared to state-of-the-art class-incremental learning methods.
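The abstract does not spell out how the angular drift is estimated, so the following is only a minimal sketch of the realignment idea: it assumes drift can be approximated by an orthogonal (Procrustes/Kabsch) rotation fitted on current-task features embedded by the previous-stage and current-stage extractors, which is then applied to old prototypes. The function names, the Procrustes estimator, and the cosine renormalization are illustrative assumptions, not ProCEED's actual procedure.

```python
import numpy as np

def estimate_subspace_rotation(feats_prev, feats_curr):
    """Fit an orthogonal map between two feature subspaces (Kabsch/Procrustes).

    feats_prev, feats_curr: (n, d) embeddings of the SAME current-task samples
    under the previous-stage and current-stage feature extractors.
    Illustrative stand-in for the paper's angular-drift estimate.
    """
    u, _, vt = np.linalg.svd(feats_prev.T @ feats_curr)
    # Guard against reflections so the estimated map is a proper rotation.
    u[:, -1] *= np.sign(np.linalg.det(u @ vt))
    return u @ vt

def realign_prototypes(protos_prev, rotation):
    """Carry old-class prototypes into the latest subspace, renormalizing so
    angular (cosine) decision boundaries are preserved."""
    moved = protos_prev @ rotation
    return moved / np.linalg.norm(moved, axis=1, keepdims=True)

# Hypothetical usage at incremental stage t, with f_prev / f_curr denoting the
# previous and updated feature extractors and X_t the current-task inputs:
#   R_t = estimate_subspace_rotation(f_prev(X_t), f_curr(X_t))
#   prototypes[:n_old] = realign_prototypes(prototypes[:n_old], R_t)
```

Because the rotation is estimated only from data available at the current stage, this style of realignment keeps the method exemplar-free: no samples from previous tasks are stored or replayed.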