Combining Variational Continual Learning with FiLM Layers

12 Jun 2020 (modified: 05 May 2023) · LifelongML@ICML2020
Student First Author: Yes
Abstract: The standard architecture for continual learning is a multi-headed neural network, with shared body parameters and task-specific heads, so features are generated in the same way for every task. This can be too restrictive, particularly when tasks are very distinct. We propose combining FiLM layers, a flexible mechanism for task-specific feature modulation in CNNs, with an existing algorithm, Variational Continual Learning (VCL). We show that this addition consistently improves performance, particularly when tasks are more varied. Furthermore, we demonstrate that FiLM layers mitigate VCL's tendency to over-prune and help it use more of its model capacity. Finally, we find that FiLM layers perform feature modulation as opposed to gating, making them more flexible than binary-mask-based approaches.
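For concreteness, below is a minimal PyTorch-style sketch of the per-task feature modulation that a FiLM layer provides, as described in the abstract. The class and argument names (FiLMLayer, num_tasks, task_id) are illustrative assumptions, not the authors' released code, and the variational (VCL) treatment of the shared body weights is omitted.

```python
import torch
import torch.nn as nn

class FiLMLayer(nn.Module):
    """Per-task feature-wise linear modulation (FiLM).

    Each task gets its own per-channel scale (gamma) and shift (beta);
    the convolutional body producing the feature map is shared across tasks.
    """
    def __init__(self, num_tasks: int, num_channels: int):
        super().__init__()
        # Initialise to the identity transform (gamma = 1, beta = 0).
        self.gamma = nn.Parameter(torch.ones(num_tasks, num_channels))
        self.beta = nn.Parameter(torch.zeros(num_tasks, num_channels))

    def forward(self, h: torch.Tensor, task_id: int) -> torch.Tensor:
        # h: (batch, channels, height, width) feature map from a shared conv layer.
        g = self.gamma[task_id].view(1, -1, 1, 1)
        b = self.beta[task_id].view(1, -1, 1, 1)
        # Real-valued modulation, not a binary gate: gamma can scale
        # features up or down rather than only switching them on or off.
        return g * h + b
```

A shared feature map is then modulated per task, e.g. out = film(conv(x), task_id=2). Because gamma is real-valued rather than binary, this is modulation rather than gating, which is the flexibility advantage over binary-mask approaches noted in the abstract.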
TL;DR: This paper adds per-task feature modulation to Variational Continual Learning, improving performance and mitigating over-pruning.