Energy-Based Models for Continual Learning

Feb 26, 2021 (edited Apr 25, 2021) · EBM_WS@ICLR2021 Oral
  • Keywords: Continual learning, Energy-based model
  • TL;DR: We motivate Energy-Based Models (EBMs) as a promising model class for continual learning problems.
  • Abstract: We motivate Energy-Based Models (EBMs) as a promising model class for continual learning problems. Instead of tackling continual learning via external memory, growing models, or regularization, EBMs have a natural way to support a dynamically-growing number of tasks or classes while causing less interference with previously learned information. Our proposed version of EBMs for continual learning is simple and efficient, and outperforms baseline methods by a large margin on several benchmarks. Moreover, our proposed contrastive divergence based training objective can be applied to other continual learning methods, resulting in substantial boosts in their performance. We also show that EBMs are adaptable to a more general continual learning setting where the data distribution changes without the notion of explicitly delineated tasks. These observations point towards EBMs as a class of models naturally inclined towards the continual learning regime.
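To make the abstract's central claim concrete, here is a minimal sketch (in PyTorch) of how an energy-based classifier can support a growing label set: each class gets its own energy head, so adding a new class adds parameters without overwriting old ones, and a contrastive-style loss can be restricted to the classes of the current task. This is an illustrative toy, not the authors' implementation; all names (`EBMClassifier`, `add_class`, `cd_style_loss`) and architectural choices are assumptions for exposition.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class EBMClassifier(nn.Module):
    """Toy class-incremental EBM: E(x, y) = -<f(x), e_y>.

    Illustrative sketch only; not the paper's actual architecture.
    """
    def __init__(self, in_dim: int, feat_dim: int):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(in_dim, feat_dim), nn.ReLU(),
            nn.Linear(feat_dim, feat_dim),
        )
        # One learnable embedding per class; grows as new classes arrive.
        self.class_emb = nn.ParameterList()
        self.feat_dim = feat_dim

    def add_class(self):
        # Growing the label set only appends a new embedding;
        # previously learned class embeddings are left untouched.
        self.class_emb.append(nn.Parameter(0.01 * torch.randn(self.feat_dim)))

    def energy(self, x: torch.Tensor) -> torch.Tensor:
        f = self.encoder(x)  # (B, D)
        # Energy for every class seen so far -> (B, num_classes)
        return torch.stack([-(f * e).sum(dim=-1) for e in self.class_emb], dim=1)

def cd_style_loss(model, x, y_local, task_classes):
    """Contrast the true class only against classes of the current task,
    so heads belonging to other tasks receive no gradient.

    `y_local` indexes into `task_classes`. Hypothetical helper, not the
    paper's exact objective.
    """
    energies = model.energy(x)[:, task_classes]       # (B, |task|)
    return F.cross_entropy(-energies, y_local)        # softmax over -E
```

Prediction is simply `model.energy(x).argmin(dim=1)`: the class whose embedding gives the lowest energy. Because negatives are drawn only from the current task, gradients never touch the embeddings of earlier classes, which is one plausible reading of the "less interference" claim above.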