Continual Learning with Gated Incremental Memories for Sequential Data Processing

25 Sep 2019 (modified: 24 Dec 2019) · ICLR 2020 Conference Blind Submission · Readers: Everyone
  • TL;DR: We tackle the problem of continual learning (CL) in sequential data processing scenarios, providing a set of domain-agnostic benchmarks on which we compare the performance of a novel RNN for CL against standard RNNs.
  • Abstract: The ability to learn over changing task distributions without forgetting previous knowledge, also known as continual learning, is a key enabler for scalable and trustworthy deployments of adaptive solutions. While the importance of continual learning is widely acknowledged in machine vision and reinforcement learning problems, it remains largely under-documented for sequence processing tasks. This work focuses on characterizing and quantitatively assessing the impact of catastrophic forgetting and task interference when dealing with sequential data in recurrent neural networks. We also introduce a general architecture, named Gated Incremental Memory, for augmenting recurrent models with continual learning skills, whose effectiveness is demonstrated through the benchmarks introduced in this paper.
  • Keywords: continual learning, recurrent neural networks, progressive networks, gating autoencoders, sequential data processing
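The keywords suggest a progressive-network-style design in which each task gets its own recurrent module, selected at inference time by per-task gating autoencoders. A minimal sketch of that gating idea follows; the class name, the linear stand-ins for the recurrent experts, and all parameters are illustrative assumptions, not the paper's actual architecture.

```python
import numpy as np

class GatedExpertPool:
    """Illustrative sketch of autoencoder-gated task experts:
    one expert per task plus a per-task autoencoder; at inference,
    the expert whose autoencoder reconstructs the input with the
    lowest error is selected (hypothetical simplification of a
    gated incremental memory)."""

    def __init__(self):
        self.experts = []       # (W, b) linear stand-ins for per-task RNNs
        self.autoencoders = []  # per-task encoder matrices

    def add_task(self, W, b, A):
        # W, b: parameters of the new task's expert; A: its autoencoder
        self.experts.append((W, b))
        self.autoencoders.append(A)

    def select_expert(self, x):
        # Pick the task whose autoencoder reconstructs x best.
        errors = []
        for A in self.autoencoders:
            recon = A.T @ (A @ x)  # encode, then decode with tied weights
            errors.append(float(np.sum((x - recon) ** 2)))
        return int(np.argmin(errors))

    def predict(self, x):
        k = self.select_expert(x)
        W, b = self.experts[k]
        return W @ x + b

# Toy usage: task 0's autoencoder spans the first input axis,
# task 1's spans the second, so inputs are routed accordingly.
pool = GatedExpertPool()
pool.add_task(np.eye(2), np.zeros(2), np.array([[1.0, 0.0]]))
pool.add_task(-np.eye(2), np.zeros(2), np.array([[0.0, 1.0]]))
assert pool.select_expert(np.array([1.0, 0.1])) == 0
assert pool.select_expert(np.array([0.1, 1.0])) == 1
```

Because each new task only appends an expert and an autoencoder, previously learned modules are never overwritten, which is the intuition behind avoiding catastrophic forgetting in progressive architectures.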