Recasting Continual Learning as Sequence Modeling

Published: 21 Sept 2023, Last Modified: 14 Jan 2024, NeurIPS 2023 poster
Keywords: meta-continual learning, sequence modeling, Transformers, efficient Transformers
TL;DR: We formulate continual learning as a sequence modeling problem, allowing advanced sequence models (e.g., Transformers) to be adopted as general meta-continual learning methods.
Abstract: In this work, we aim to establish a strong connection between two significant bodies of machine learning research: continual learning and sequence modeling. That is, we propose to formulate continual learning as a sequence modeling problem, allowing advanced sequence models to be utilized for continual learning. Under this formulation, the continual learning process becomes the forward pass of a sequence model. By adopting the meta-continual learning (MCL) framework, we can train the sequence model at the meta-level, on multiple continual learning episodes. As a specific example of our new formulation, we demonstrate the application of Transformers and their efficient variants as MCL methods. Our experiments on seven benchmarks, covering both classification and regression, show that sequence models can be an attractive solution for general MCL.
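
To make the formulation concrete, below is a minimal, hypothetical sketch of the core idea described in the abstract: a causal Transformer consumes a continual learning episode as a sequence of (input, label) pairs, so that "learning" happens inside the forward pass, and the model itself is meta-trained across many episodes. All names, dimensions, and the placement of queries at the end of the sequence are illustrative assumptions, not the authors' reference implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SequenceModelCL(nn.Module):
    """Sketch of continual learning as sequence modeling (assumed layout):
    a causal Transformer reads a stream of (x, y) pairs, then predicts
    labels for query inputs whose labels are masked out."""

    def __init__(self, x_dim, n_classes, d_model=128, n_layers=4, n_heads=4):
        super().__init__()
        self.embed_x = nn.Linear(x_dim, d_model)
        self.embed_y = nn.Embedding(n_classes + 1, d_model)  # extra index = "unknown" label for queries
        self.unknown = n_classes
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, n_classes)

    def forward(self, stream_x, stream_y, query_x):
        # stream_x: (B, T, x_dim), stream_y: (B, T), query_x: (B, Q, x_dim)
        ctx = self.embed_x(stream_x) + self.embed_y(stream_y)   # observed (x, y) tokens
        unk = torch.full(query_x.shape[:2], self.unknown,
                         dtype=torch.long, device=query_x.device)
        qry = self.embed_x(query_x) + self.embed_y(unk)         # query tokens with unknown label
        seq = torch.cat([ctx, qry], dim=1)
        mask = nn.Transformer.generate_square_subsequent_mask(seq.size(1)).to(seq.device)
        h = self.encoder(seq, mask=mask)                        # causal forward pass = "continual learning"
        return self.head(h[:, -query_x.size(1):])               # logits for the queries

# Hypothetical meta-training loop: the outer loop iterates over continual
# learning episodes, and the sequence model is trained end-to-end to answer
# queries after having seen the episode's stream.
# for stream_x, stream_y, query_x, query_y in episode_loader:
#     logits = model(stream_x, stream_y, query_x)
#     loss = F.cross_entropy(logits.reshape(-1, logits.size(-1)), query_y.reshape(-1))
#     loss.backward(); optimizer.step(); optimizer.zero_grad()
```

Under this reading, no gradient updates occur during the continual learning episode itself; adaptation is carried entirely by the sequence model's context, which is what allows efficient Transformer variants to be swapped in.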
Supplementary Material: zip
Submission Number: 4212