Energy-Based Models for Continual Learning

Published: 25 Apr 2021 (Last Modified: 22 Oct 2023), EBM_WS@ICLR2021 Oral
Keywords: Continual learning, Energy-based model
TL;DR: We motivate Energy-Based Models (EBMs) as a promising model class for continual learning problems.
Abstract: We motivate Energy-Based Models (EBMs) as a promising model class for continual learning problems. Instead of tackling continual learning via external memory, growing models, or regularization, EBMs support a dynamically growing number of tasks or classes in a way that causes less interference with previously learned information. Our proposed version of EBMs for continual learning is simple, efficient, and outperforms baseline methods by a large margin on several benchmarks. Moreover, our proposed contrastive-divergence-based training objective can be applied to other continual learning methods, resulting in substantial boosts in their performance. We also show that EBMs are adaptable to a more general continual learning setting where the data distribution changes without the notion of explicitly delineated tasks. These observations point towards EBMs as a class of models naturally inclined towards the continual learning regime.
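To make the abstract's core idea concrete, here is a minimal, hypothetical sketch (not the paper's actual architecture or training code) of how an EBM can classify by assigning a scalar energy to each (input, class) pair and predicting the lowest-energy class. Adding a new class only introduces a new class embedding, leaving existing parameters untouched, which illustrates why such models can grow with less interference. The energy function, update rule, and all names below are illustrative assumptions.

```python
import random

DIM = 4  # embedding dimension (illustrative choice)

def energy(x, class_emb):
    # Negative dot product: lower energy means a better (x, class) match.
    return -sum(xi * ci for xi, ci in zip(x, class_emb))

class EBMClassifier:
    """Toy EBM classifier: one embedding per class, energy-based prediction."""

    def __init__(self):
        self.class_embs = {}  # label -> embedding; grows as classes arrive

    def add_class(self, label):
        # Registering a new class touches no existing parameters.
        self.class_embs[label] = [random.uniform(-0.1, 0.1) for _ in range(DIM)]

    def predict(self, x, candidates=None):
        # Classify by picking the class whose embedding gives lowest energy.
        labels = candidates if candidates is not None else list(self.class_embs)
        return min(labels, key=lambda y: energy(x, self.class_embs[y]))

    def train_step(self, x, y_pos, lr=0.1):
        # Contrastive-style update (a simplification of contrastive divergence):
        # lower the energy of the observed class, raise it for the others.
        for y, emb in self.class_embs.items():
            sign = 1.0 if y == y_pos else -1.0
            for i in range(DIM):
                # dE/d emb_i = -x_i, so a step of +lr*x_i lowers the energy.
                emb[i] += sign * lr * x[i]
```

Usage: train on a couple of classes, then call `add_class` at any point to extend the label set without modifying already-learned embeddings.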
Community Implementations: 2 code implementations (https://www.catalyzex.com/paper/arxiv:2011.12216/code)