Tackling Non-forgetting and Forward Transfer with a Unified Lifelong Learning Approach

Jun 12, 2020 (edited Aug 02, 2020) · ICML 2020 Workshop LifelongML Blind Submission
  • Student First Author: Yes
  • TL;DR: A new approach to tackle more lifelong learning properties
  • Keywords: Lifelong Learning, Catastrophic Forgetting, Forward Transfer, Backward Transfer
  • Abstract: Humans are the best example of agents that can learn a variety of skills incrementally over the course of their lives, and imbuing machines with this ability is the goal of lifelong machine learning. Ideally, lifelong learning should achieve non-forgetting, forward and backward transfer, avoid confusion, support few-shot learning, and so on. Previous approaches have focused on subsets of these properties, often by stitching together an array of separate mechanisms. In this work, we propose a simple yet powerful unified framework that supports almost all of these properties through {\em one} central consolidation mechanism. We then describe a particular instance of this framework designed to support non-forgetting and forward transfer. This novel approach works by efficiently locating sparse neural sub-networks and controlling their consolidation during lifelong learning.
  • Previously Published: Xinyu Yun
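The abstract's core mechanism can be illustrated with a minimal sketch. This is an assumption-laden illustration, not the authors' implementation: after training on a task, the largest-magnitude weights are taken as that task's sparse sub-network and "consolidated" (frozen) so later tasks cannot overwrite them (non-forgetting), while remaining readable by later tasks (forward transfer). The function names and the magnitude-pruning criterion are hypothetical choices for illustration.

```python
import numpy as np

def sub_network_mask(weights, sparsity=0.8):
    """Boolean mask selecting the (1 - sparsity) fraction of
    largest-magnitude weights as the task's sparse sub-network.
    (Magnitude pruning is an assumed selection criterion.)"""
    k = int(round((1.0 - sparsity) * weights.size))
    threshold = np.sort(np.abs(weights), axis=None)[-k]
    return np.abs(weights) >= threshold

def consolidated_update(weights, grad, frozen_mask, lr=0.1):
    """Gradient step that leaves consolidated (frozen) weights
    untouched: gradients are masked out on the frozen sub-network."""
    return weights - lr * grad * ~frozen_mask

rng = np.random.default_rng(0)
w = rng.normal(size=10)
frozen = sub_network_mask(w, sparsity=0.8)  # 2 of 10 weights consolidated
w_new = consolidated_update(w, grad=np.ones_like(w), frozen_mask=frozen)
# Frozen weights are unchanged; the remaining weights moved by -lr * grad.
```

In this sketch, consolidation is binary (a weight is either fully frozen or fully trainable); the paper's framing of "controlling" consolidation suggests the real mechanism may be softer or gradual.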