Pseudo-Rehearsal for Continual Learning with Normalizing Flows

12 Jun 2020 (modified: 29 Sept 2024) · LifelongML@ICML2020
Student First Author: Yes
Keywords: Machine Learning, Continual Learning, Normalizing Flow, Catastrophic Forgetting
Abstract: Catastrophic forgetting (CF) happens whenever a neural network overwrites past knowledge while being trained on new tasks. Common techniques to handle CF include regularization of the weights (using, e.g., their importance on past tasks) and rehearsal strategies, where the network is continually re-trained on past data. Generative models have also been applied to the latter, providing an endless source of rehearsal data. In this paper, we propose a novel method that combines the strengths of regularization and generative rehearsal approaches. Our generative model consists of a normalizing flow (NF), a probabilistic and invertible neural network, trained on the internal embeddings of the network. By keeping a single NF conditioned on the task, we show that the memory overhead remains constant. In addition, exploiting the invertibility of the NF, we propose a simple approach to regularize the network's embeddings with respect to past tasks. We show that our method performs favorably with respect to state-of-the-art approaches in the literature, with bounded computational and memory overheads.
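
The sketch below is a minimal, hypothetical illustration of the idea the abstract describes, not the authors' implementation: a single task-conditioned normalizing flow (here, RealNVP-style affine couplings) trained by maximum likelihood on a backbone's internal embeddings, then inverted to sample pseudo-embeddings of past tasks for rehearsal. All names (`TaskConditionedFlow`, `EMB_DIM`, layer sizes, etc.) and architectural choices are assumptions.

```python
# Minimal sketch (assumed, not the paper's code) of a task-conditioned
# normalizing flow over feature embeddings, usable for pseudo-rehearsal.
import torch
import torch.nn as nn

EMB_DIM = 64   # assumed dimensionality of the backbone's embeddings
N_TASKS = 5    # assumed number of tasks the flow is conditioned on


class AffineCoupling(nn.Module):
    """One RealNVP-style coupling layer, conditioned on a task embedding."""
    def __init__(self, dim, task_dim, hidden=128):
        super().__init__()
        self.half = dim // 2
        self.net = nn.Sequential(
            nn.Linear(self.half + task_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * (dim - self.half)),
        )

    def forward(self, x, t):
        x1, x2 = x[:, :self.half], x[:, self.half:]
        s, b = self.net(torch.cat([x1, t], dim=1)).chunk(2, dim=1)
        s = torch.tanh(s)                       # keep scales bounded
        y2 = x2 * torch.exp(s) + b
        return torch.cat([x1, y2], dim=1), s.sum(dim=1)

    def inverse(self, y, t):
        y1, y2 = y[:, :self.half], y[:, self.half:]
        s, b = self.net(torch.cat([y1, t], dim=1)).chunk(2, dim=1)
        s = torch.tanh(s)
        x2 = (y2 - b) * torch.exp(-s)
        return torch.cat([y1, x2], dim=1)


class TaskConditionedFlow(nn.Module):
    def __init__(self, dim, n_tasks, n_layers=4, task_dim=16):
        super().__init__()
        self.dim = dim
        self.task_emb = nn.Embedding(n_tasks, task_dim)
        self.layers = nn.ModuleList(
            [AffineCoupling(dim, task_dim) for _ in range(n_layers)]
        )
        self.base = torch.distributions.Normal(0.0, 1.0)

    def log_prob(self, x, task_id):
        t = self.task_emb(task_id)
        logdet = x.new_zeros(x.shape[0])
        for layer in self.layers:
            x, ld = layer(x, t)
            x = x.flip(dims=[1])                # swap halves between couplings
            logdet = logdet + ld
        return self.base.log_prob(x).sum(dim=1) + logdet

    def sample(self, n, task_id):
        t = self.task_emb(task_id)
        z = torch.randn(n, self.dim)
        for layer in reversed(self.layers):
            z = z.flip(dims=[1])                # undo the half-swap
            z = layer.inverse(z, t)
        return z                                # pseudo-embeddings for rehearsal


# Usage: maximize log-likelihood of the current task's embeddings, then sample
# pseudo-embeddings of an earlier task to rehearse the classifier head on.
flow = TaskConditionedFlow(EMB_DIM, N_TASKS)
emb = torch.randn(32, EMB_DIM)                  # stand-in for backbone embeddings
task = torch.zeros(32, dtype=torch.long)
loss = -flow.log_prob(emb, task).mean()
loss.backward()
pseudo = flow.sample(16, torch.zeros(16, dtype=torch.long))
```

Because a single flow is shared across tasks and only the task embedding distinguishes them, the memory cost of this rehearsal mechanism stays constant in the number of tasks, which is the property the abstract highlights.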
Community Implementations: [1 code implementation](https://www.catalyzex.com/paper/pseudo-rehearsal-for-continual-learning-with/code)