Unsupervised Continual Learning and Self-Taught Associative Memory Hierarchies

James Smith, Seth Baer, Zsolt Kira, Constantine Dovrolis

Mar 24, 2019 · ICLR 2019 Workshop LLD Blind Submission
  • Keywords: continual learning, unsupervised learning, online learning
  • TL;DR: We introduce unsupervised continual learning (UCL) and a neuro-inspired architecture that solves the UCL problem.
  • Abstract: We first pose the Unsupervised Continual Learning (UCL) problem: learning salient representations from a non-stationary stream of unlabeled data in which the number of object classes varies with time. Given limited labeled data just before inference, these representations can also be associated with specific object types to perform classification. To solve the UCL problem, we propose an architecture built from a single module, called the Self-Taught Associative Memory (STAM), which loosely models the function of a cortical column in the mammalian brain. Hierarchies of STAM modules learn through a combination of Hebbian learning, online clustering, novelty detection, forgetting of outliers, and top-down predictions. We illustrate the operation of STAMs by learning handwritten digits in a continual manner with only 3-12 labeled examples per class. STAMs suggest a promising direction for solving the UCL problem without catastrophic forgetting.
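The abstract describes each STAM module as combining online clustering, novelty detection, and forgetting of outliers. As a rough illustration of how a single such unit could process a non-stationary stream, here is a minimal Python sketch. The class name `STAMUnit`, the distance-based novelty threshold, the 1/count learning rate, and the staleness-based forgetting rule are all assumptions made for illustration, not the authors' implementation (which additionally involves Hebbian learning and top-down predictions across a hierarchy of modules).

```python
import numpy as np

class STAMUnit:
    """Illustrative online-clustering unit (not the authors' code):
    nearest-centroid assignment, incremental centroid updates,
    novelty detection via a distance threshold, and forgetting of
    stale (outlier) centroids. All thresholds are assumptions."""

    def __init__(self, novelty_threshold=5.0, forget_after=500):
        self.centroids, self.counts, self.last_used = [], [], []
        self.t = 0
        self.novelty_threshold = novelty_threshold
        self.forget_after = forget_after

    def observe(self, x):
        """Process one unlabeled input; return its centroid index."""
        self.t += 1
        if self.centroids:
            dists = np.linalg.norm(np.stack(self.centroids) - x, axis=1)
            j = int(np.argmin(dists))
        if not self.centroids or dists[j] > self.novelty_threshold:
            # Novelty: no existing centroid matches well, so allocate
            # a new one instead of overwriting an old memory.
            self.centroids.append(x.astype(float))
            self.counts.append(1)
            self.last_used.append(self.t)
            j = len(self.centroids) - 1
        else:
            # Online clustering: move the winning centroid toward x
            # with a per-centroid learning rate of 1/count.
            self.counts[j] += 1
            self.centroids[j] += (x - self.centroids[j]) / self.counts[j]
            self.last_used[j] = self.t
        self._forget()
        return j

    def _forget(self):
        # Drop centroids that have not matched any recent input;
        # these behave like outliers rather than stable classes.
        keep = [i for i, u in enumerate(self.last_used)
                if self.t - u < self.forget_after]
        self.centroids = [self.centroids[i] for i in keep]
        self.counts = [self.counts[i] for i in keep]
        self.last_used = [self.last_used[i] for i in keep]

if __name__ == "__main__":
    # Toy non-stationary stream: one 2-D cluster appears first,
    # a second, well-separated cluster appears later.
    rng = np.random.default_rng(0)
    unit = STAMUnit(novelty_threshold=3.0, forget_after=500)
    stream = np.concatenate([rng.normal(0, 1, (300, 2)),
                             rng.normal(10, 1, (300, 2))])
    for x in stream:
        unit.observe(x)
    print(f"{len(unit.centroids)} centroids learned")
```

Because new centroids are created only when an input is far from all existing ones, the unit can absorb newly appearing classes without perturbing the centroids of earlier ones, which is one plausible way such a design could avoid catastrophic forgetting.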