Rehearsal-Free Modular and Compositional Continual Learning for Language Models

Anonymous

16 Dec 2023 · ACL ARR 2023 December Blind Submission · Readers: Everyone
TL;DR: We propose MoCL, a rehearsal-free, modular, and compositional continual learning framework for language models that tackles catastrophic forgetting while enhancing knowledge transfer.
Abstract: Continual learning aims to incrementally acquire new knowledge without forgetting existing knowledge. To overcome catastrophic forgetting, existing methods are either rehearsal-based, i.e., they store data examples from previous tasks for replay, or they isolate parameters dedicated to each task. However, rehearsal-based methods raise privacy and memory concerns, and parameter-isolation methods do not model interactions between tasks, which hinders knowledge transfer. In this work, we propose MoCL, a rehearsal-free modular and compositional continual learning framework that continually adds new modules to language models and composes them with existing modules. Experiments on various benchmarks show that MoCL outperforms the state of the art and effectively facilitates knowledge transfer.
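To make the modular-and-compositional idea concrete, below is a minimal PyTorch sketch of one plausible reading of the abstract, not the paper's actual implementation. It assumes each task adds a small trainable module (a residual adapter standing in for whatever PEFT module the paper uses) on top of a frozen backbone, plus a learned per-task feature vector; module outputs are then combined with weights given by the similarity between the input representation and each task feature. All names here (TaskModule, ModularComposer, add_task) are hypothetical.

# Minimal sketch of modular, compositional continual learning.
# Assumptions (not from the abstract): one small trainable module and one
# task feature vector per task; older modules are frozen when a new task
# arrives; composition weights come from input-to-task-feature similarity.

import torch
import torch.nn as nn
import torch.nn.functional as F

class TaskModule(nn.Module):
    """A small per-task module (stand-in for a prefix/adapter)."""
    def __init__(self, hidden_dim: int):
        super().__init__()
        self.down = nn.Linear(hidden_dim, hidden_dim // 4)
        self.up = nn.Linear(hidden_dim // 4, hidden_dim)

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        return h + self.up(F.relu(self.down(h)))  # residual adapter

class ModularComposer(nn.Module):
    """Holds one module and one feature vector per task; composes them."""
    def __init__(self, hidden_dim: int):
        super().__init__()
        self.hidden_dim = hidden_dim
        self.task_modules = nn.ModuleList()
        self.task_features = nn.ParameterList()

    def add_task(self):
        # Freeze everything learned so far, then append a new trainable
        # module and task feature: old knowledge cannot be overwritten.
        for p in self.parameters():
            p.requires_grad_(False)
        self.task_modules.append(TaskModule(self.hidden_dim))
        self.task_features.append(nn.Parameter(torch.randn(self.hidden_dim)))

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        # Composition weights: similarity between the mean-pooled input
        # representation and each task's feature vector.
        pooled = h.mean(dim=1)                         # (batch, hidden)
        keys = torch.stack(list(self.task_features))   # (tasks, hidden)
        weights = F.softmax(pooled @ keys.T, dim=-1)   # (batch, tasks)
        outputs = torch.stack([m(h) for m in self.task_modules], dim=1)
        return (weights[:, :, None, None] * outputs).sum(dim=1)

# Usage (frozen backbone omitted): add one module per task in sequence.
composer = ModularComposer(hidden_dim=768)
composer.add_task()               # task 1
composer.add_task()               # task 2; task 1's module is now frozen
h = torch.randn(2, 16, 768)       # (batch, seq_len, hidden)
print(composer(h).shape)          # torch.Size([2, 16, 768])

Freezing earlier modules in add_task is what would make such a setup rehearsal-free: no stored examples are needed because old parameters can no longer be overwritten, while the soft composition weights let later tasks reuse earlier modules, which is where the claimed knowledge transfer would come from.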
Paper Type: short
Research Area: Machine Learning for NLP
Contribution Types: NLP engineering experiment
Languages Studied: English, Amharic, Algerian Arabic/Darja, Hausa, Igbo, Kinyarwanda, Moroccan Arabic/Darija, Mozambican Portuguese, Nigerian Pidgin, Oromo, Swahili, Tigrinya, Twi, Xitsonga, Yoruba