Scaling characteristics of sequential multitask learning: Networks naturally learn to learn

Published: 04 Jun 2019, Last Modified: 05 May 2023 · ICML Deep Phenomena 2019
Keywords: catastrophic forgetting, backward interference, forward facilitation, multitask learning, lifelong learning, metalearning
Abstract: We explore the behavior of a standard convolutional neural net in a setting that introduces classification tasks sequentially and requires the net to master new tasks while preserving mastery of previously learned tasks. This setting corresponds to the one human learners face as they acquire domain expertise, for example, as an individual reads a textbook chapter-by-chapter. Through simulations involving sequences of 10 related tasks, we find reason for optimism that nets will scale well as they advance from having a single skill to becoming domain experts. We observe two key phenomena. First, forward facilitation (the accelerated learning of task n+1 after having learned n previous tasks) grows with n. Second, backward interference (the forgetting of the n previous tasks when learning task n+1) diminishes with n. Forward facilitation is the goal of research on metalearning, and reduced backward interference is the goal of research on ameliorating catastrophic forgetting. We find that both goals are attained simply through broader exposure to a domain.
TL;DR: We study the behavior of a CNN as it masters new tasks while preserving mastery of previously learned tasks
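
The abstract describes the experimental protocol only in prose. Below is a minimal sketch, not the authors' code, of how the two measurements could be instrumented in PyTorch: a single net is trained on tasks 1..N in sequence, recording epochs-to-criterion on each new task (forward facilitation) and the retention accuracy on all earlier tasks (backward interference). The architecture, the synthetic placeholder tasks, and the criterion accuracy are all assumptions for illustration.

```python
# Sketch of the sequential multitask protocol (hypothetical setup,
# not the paper's actual tasks, architecture, or hyperparameters).
import torch
import torch.nn as nn

NUM_TASKS, CLASSES, CRITERION_ACC = 10, 5, 0.95

def make_task(n=256):
    """Placeholder task: random 1x28x28 images with random labels."""
    x = torch.randn(n, 1, 28, 28)
    y = torch.randint(0, CLASSES, (n,))
    return x, y

def accuracy(net, x, y):
    with torch.no_grad():
        return (net(x).argmax(1) == y).float().mean().item()

net = nn.Sequential(                      # small stand-in CNN
    nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(), nn.Linear(32 * 7 * 7, CLASSES),
)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

tasks = [make_task() for _ in range(NUM_TASKS)]
for n, (x, y) in enumerate(tasks):
    # Forward facilitation: epochs needed to reach criterion on task n+1.
    epochs = 0
    while accuracy(net, x, y) < CRITERION_ACC and epochs < 500:
        opt.zero_grad()
        loss_fn(net(x), y).backward()
        opt.step()
        epochs += 1
    print(f"task {n + 1}: {epochs} epochs to criterion")
    # Backward interference: retention on all previously learned tasks.
    for m, (xm, ym) in enumerate(tasks[:n]):
        print(f"  retention on task {m + 1}: {accuracy(net, xm, ym):.2f}")
```

In this sketch all tasks share one output head; the paper's claims would show up as a declining epochs-to-criterion curve across tasks and retention accuracies that degrade less as n grows.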