Continual Learning with Gated Incremental Memories for Sequential Data Processing

25 Sept 2019 (modified: 22 Oct 2023) · ICLR 2020 Conference Blind Submission · Readers: Everyone
TL;DR: We tackle the problem of continual learning (CL) in sequential data processing scenarios, providing a set of domain-agnostic benchmarks on which we compare the performance of a novel recurrent architecture for CL against standard RNNs.
Abstract: The ability to learn over changing task distributions without forgetting previous knowledge, also known as continual learning, is a key enabler for scalable and trustworthy deployments of adaptive solutions. While the importance of continual learning is widely acknowledged in machine vision and reinforcement learning problems, it remains largely under-investigated for sequence processing tasks. This work focuses on characterizing and quantitatively assessing the impact of catastrophic forgetting and task interference when dealing with sequential data in recurrent neural networks. We also introduce a general architecture, named Gated Incremental Memory, for augmenting recurrent models with continual learning capabilities, whose effectiveness is demonstrated through the benchmarks introduced in this paper.
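To make the architectural idea concrete, below is a minimal, hypothetical sketch assuming a progressive-network-style design suggested by the abstract and keywords: each new task adds a recurrent column that also receives the previous column's hidden states (earlier columns are frozen), and a per-task input autoencoder acts as a gate that selects the column at inference time. Class and method names (`ProgressiveGatedRNN`, `add_task`) are illustrative only and do not reproduce the paper's actual GIM implementation.

```python
import torch
import torch.nn as nn


class ProgressiveGatedRNN(nn.Module):
    """Sketch: one recurrent column per task (previous columns frozen),
    plus a per-task input autoencoder used to pick the column at test time."""

    def __init__(self, input_size, hidden_size, num_classes):
        super().__init__()
        self.input_size = input_size
        self.hidden_size = hidden_size
        self.num_classes = num_classes
        self.columns = nn.ModuleList()       # one LSTM column per task
        self.heads = nn.ModuleList()         # one classifier head per task
        self.autoencoders = nn.ModuleList()  # one gating autoencoder per task

    def add_task(self):
        # Freeze everything learned so far (progressive-network style).
        for p in self.parameters():
            p.requires_grad_(False)
        # The new column also receives the previous column's hidden states.
        extra = self.hidden_size if len(self.columns) > 0 else 0
        self.columns.append(nn.LSTM(self.input_size + extra, self.hidden_size,
                                    batch_first=True))
        self.heads.append(nn.Linear(self.hidden_size, self.num_classes))
        bottleneck = max(self.input_size // 2, 1)
        self.autoencoders.append(nn.Sequential(
            nn.Linear(self.input_size, bottleneck), nn.ReLU(),
            nn.Linear(bottleneck, self.input_size)))

    def forward(self, x, task_id=None):
        # x: (batch, time, input_size). Without a task label, gate by the
        # autoencoder with the lowest reconstruction error on the inputs.
        if task_id is None:
            errors = [((ae(x) - x) ** 2).mean() for ae in self.autoencoders]
            task_id = int(torch.stack(errors).argmin())
        h = None
        for i in range(task_id + 1):
            inp = x if h is None else torch.cat([x, h], dim=-1)
            h, _ = self.columns[i](inp)
        return self.heads[task_id](h[:, -1])
```

In this sketch, training would call `add_task()` before each new task and optimize only the newest column, head, and autoencoder; at test time, calling `forward()` without a `task_id` selects the column via the gating autoencoders.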
Code: https://drive.google.com/open?id=1L2_y35Zy5xahmxQRqGDCGonPiHeyuJlp
Keywords: continual learning, recurrent neural networks, progressive networks, gating autoencoders, sequential data processing
Community Implementations: [1 code implementation (CatalyzeX)](https://www.catalyzex.com/paper/arxiv:2004.04077/code)