Continual Learning via Neural Pruning

Published: 02 Oct 2019, Last Modified: 21 Apr 2024
Real Neurons & Hidden Units @ NeurIPS 2019 Poster
Keywords: lifelong learning, catastrophic forgetting
TL;DR: We use simple and biologically motivated modifications of standard learning techniques to achieve state-of-the-art performance on catastrophic forgetting benchmarks.
Abstract: Inspired by the modularity and the life-cycle of biological neurons, we introduce Continual Learning via Neural Pruning (CLNP), a new method aimed at lifelong learning in fixed-capacity models based on the pruning of neurons of low activity. In this method, an L1 regularizer is used to promote the presence of neurons of zero or low activity, whose connections to previously active neurons are permanently severed at the end of training. Subsequent tasks are trained using these pruned neurons after reinitialization and cause zero deterioration to the performance of previous tasks. We show empirically that this biologically inspired method leads to state-of-the-art results, beating or matching current methods of higher computational complexity.
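
To make the procedure described in the abstract concrete, here is a minimal PyTorch sketch of the idea: train with an L1 penalty on hidden activations, then sever the low-activity neurons' outgoing connections and reinitialize them for the next task. All names (`TwoLayerNet`, `sever_and_reinit`, `ACTIVITY_THRESHOLD`, the penalty weight) are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

ACTIVITY_THRESHOLD = 1e-2  # hypothetical cutoff separating "free" from "used" neurons

class TwoLayerNet(nn.Module):
    """Small network whose hidden layer is partitioned into neurons
    used by earlier tasks and free (recyclable) neurons."""
    def __init__(self, d_in=784, d_hidden=256, d_out=10):
        super().__init__()
        self.fc1 = nn.Linear(d_in, d_hidden)
        self.fc2 = nn.Linear(d_hidden, d_out)

    def forward(self, x):
        h = torch.relu(self.fc1(x))
        return self.fc2(h), h

def loss_with_l1_activity(logits, hidden, targets, lam=1e-4):
    # The L1 term on hidden activations pushes unneeded neurons toward
    # zero activity so they can be pruned and recycled after training.
    return nn.functional.cross_entropy(logits, targets) + lam * hidden.abs().mean()

@torch.no_grad()
def mean_activity(model, x):
    # Mean absolute activation of each hidden neuron on a batch of data.
    _, h = model(x)
    return h.abs().mean(dim=0)

@torch.no_grad()
def sever_and_reinit(model, activity, thresh=ACTIVITY_THRESHOLD):
    free = activity < thresh  # low-activity neurons to recycle
    # Permanently sever connections from free neurons into the previously
    # active outputs, so retraining them cannot alter old-task behavior.
    model.fc2.weight[:, free] = 0.0
    # Reinitialize the free neurons' incoming weights for the next task.
    fresh = torch.empty_like(model.fc1.weight[free])
    nn.init.kaiming_uniform_(fresh, a=5 ** 0.5)
    model.fc1.weight[free] = fresh
    model.fc1.bias[free] = 0.0
    return free

# Usage on synthetic data. For zero forgetting, the used neurons' weights
# must also be frozen during the next task (e.g. via gradient masks),
# which this sketch omits for brevity.
model = TwoLayerNet()
x, y = torch.randn(64, 784), torch.randint(0, 10, (64,))
logits, h = model(x)
loss_with_l1_activity(logits, h, y).backward()
free = sever_and_reinit(model, mean_activity(model, x))
print(f"recycled {int(free.sum())} of {free.numel()} hidden neurons")
```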
Community Implementations: [1 code implementation (CatalyzeX)](https://www.catalyzex.com/paper/arxiv:1903.04476/code)