MAML-CL: Edited Model-Agnostic Meta-Learning for Continual Learning

Anonymous

16 Nov 2021 (modified: 05 May 2023) · ACL ARR 2021 November Blind Submission
Abstract: Continual learning (CL) requires a model to learn well on all sequentially seen tasks drawn from various domains. Yet, existing sequential training methods fail to consolidate knowledge learned from earlier tasks under data distribution shifts, thereby leading to catastrophic forgetting. We devise an optimization-based meta-learning framework for CL in accordance with MAML, in which query samples are edited to generalize learned knowledge. We conduct extensive experiments on text classification in a low-resource CL setup, where each training set is downsized to 10% of its original size. The experimental results demonstrate the superiority of our method in terms of stability, fast adaptation, memory efficiency, and knowledge retention across various domains.
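To make the setup concrete, below is a minimal first-order MAML-style inner/outer loop sketch in PyTorch. This is an illustration under assumptions, not the paper's implementation: the abstract does not specify how query samples are edited, so that step is stubbed out as a hypothetical edit_queries hook, and the model, optimizers, and hyperparameters are placeholders.

    import copy
    import torch
    import torch.nn as nn

    def edit_queries(x_query, y_query):
        # Hypothetical hook for the paper's query-editing step; its details
        # are not given in the abstract, so this is an identity placeholder.
        return x_query, y_query

    def fomaml_step(model, loss_fn, support, query, inner_lr=1e-2, inner_steps=1):
        x_s, y_s = support
        x_q, y_q = edit_queries(*query)

        # Inner loop: adapt a throwaway copy of the model on the support set.
        fast = copy.deepcopy(model)
        inner_opt = torch.optim.SGD(fast.parameters(), lr=inner_lr)
        for _ in range(inner_steps):
            inner_opt.zero_grad()
            loss_fn(fast(x_s), y_s).backward()
            inner_opt.step()

        # First-order outer update: gradients of the query loss w.r.t. the
        # adapted copy are accumulated into the original (meta) parameters.
        query_loss = loss_fn(fast(x_q), y_q)
        grads = torch.autograd.grad(query_loss, fast.parameters())
        for p, g in zip(model.parameters(), grads):
            p.grad = g.clone() if p.grad is None else p.grad + g
        return query_loss.item()

    # Toy usage: one meta-iteration on a synthetic 4-way classification task.
    model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 4))
    meta_opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()
    support = (torch.randn(8, 16), torch.randint(0, 4, (8,)))
    query = (torch.randn(8, 16), torch.randint(0, 4, (8,)))
    meta_opt.zero_grad()
    fomaml_step(model, loss_fn, support, query)
    meta_opt.step()

In a continual-learning run, support/query pairs would be drawn from a stream of tasks and the same meta-parameters updated across all of them; the first-order variant is used here only to keep the sketch short.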