Keywords: continual learning, class-incremental learning, exemplar-free
TL;DR: We propose a prototype decoding and re-encoding mechanism that integrates seamlessly with most prototype-based CIL methods to alleviate prototype expiration.
Abstract: Prototype-based Class-Incremental Learning (CIL) methods mitigate catastrophic forgetting with competitive performance while storing only class prototypes in memory instead of past-task exemplars.
However, as the encoder is updated across CIL sessions, prototypes computed in earlier sessions by the old encoder become outdated, i.e., they no longer match the correspondingly drifted ground-truth prototypes of past-task data.
In this paper, we propose a prototype update mechanism, termed Prototype Decoding and Re-encoding (PDR). We find that by combining the knowledge in the stored prototypes with the latest frozen encoder, we can obtain a high-quality approximation of the sample distribution of past data, and we use this extra information to guide the prototype update.
Our prototype update mechanism is plug-and-play and can be seamlessly integrated with most prototype-based CIL methods.
A computationally efficient three-step PDR stage is added after the encoder has been trained by a prototype-based CIL method.
First, we use the old encoder and the stored prototypes to guide the training of a decoder. Then, an adapter is trained on pseudo exemplars generated by the decoder. Finally, we use the adapter to re-encode the stored prototypes.
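The three steps above can be illustrated with a deliberately simplified toy sketch. Here the old and new encoders are hypothetical linear maps, the decoder is fit by least squares on current-session data (a stand-in for the paper's decoder training), and the encoder is made invertible so the decoder can be near-exact; none of these choices come from the paper itself:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linear encoders standing in for the old and new networks (assumption).
d_in, d_feat = 4, 8
W_old = rng.normal(size=(d_feat, d_in))
W_new = W_old + 0.3 * rng.normal(size=(d_feat, d_in))  # drifted encoder after a new session

f_old = lambda x: x @ W_old.T
f_new = lambda x: x @ W_new.T

# Past-task data is unavailable at update time; only its prototype
# (the mean old-space feature) is stored in memory.
past_x = rng.normal(size=(100, d_in))
proto_old = f_old(past_x).mean(axis=0)

# Step 1: fit a decoder that maps old-space features back to input space,
# using current-session data (stand-in for prototype-guided decoder training).
cur_x = rng.normal(size=(200, d_in))
W_dec, *_ = np.linalg.lstsq(f_old(cur_x), cur_x, rcond=None)

# Step 2: generate pseudo exemplars around the stored prototype, then train an
# adapter that maps old features to new features on those pseudo exemplars.
pseudo_z = proto_old + 0.1 * rng.normal(size=(50, d_feat))
pseudo_x = pseudo_z @ W_dec
A, *_ = np.linalg.lstsq(f_old(pseudo_x), f_new(pseudo_x), rcond=None)

# Step 3: re-encode the stored prototype with the adapter.
proto_updated = proto_old @ A

# Ground-truth drifted prototype (evaluation only; unavailable in real CIL).
proto_true = f_new(past_x).mean(axis=0)
err_stale = np.linalg.norm(proto_old - proto_true)
err_updated = np.linalg.norm(proto_updated - proto_true)
```

In this linear toy setting the updated prototype lands much closer to the drifted ground-truth prototype than the stale stored one, which is the effect the PDR stage targets; with real nonlinear encoders the decoder and adapter would be small trained networks rather than least-squares fits.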
Extensive experiments show that LDC, a simple CIL method, combined with PDR outperforms current exemplar-free baselines by up to 3.86\%.
Moreover, including the PDR mechanism adds no per-session time overhead relative to the already time-efficient LDC baseline.
Supplementary Material: zip
Primary Area: transfer learning, meta learning, and lifelong learning
Submission Number: 8322