LPC: A Logits and Parameter Calibration Framework for Continual Learning

Anonymous

05 Jun 2022 (modified: 05 May 2023) · ACL ARR 2022 June Blind Submission · Readers: Everyone
Keywords: catastrophic forgetting, continual learning, lifelong learning, natural language processing
Abstract: When we apply the typical fine-tuning paradigm to a continuous sequence of tasks, the model suffers from catastrophic forgetting (i.e., it forgets the parameters learned on previous tasks when it is trained on newly emerged tasks). Existing replay-based methods require extra storage for old data in order to update the parameters of the previous classifier and overcome catastrophic forgetting. Our work aims to achieve sequential/continual learning of knowledge without accessing the old data. The core idea is to calibrate the parameters and logits (outputs) so that preserving old parameters and generalizing to new concepts can be addressed simultaneously. Our proposed framework includes two major components: Logits Calibration (LC) and Parameter Calibration (PC). LC calibrates the learning of the new model against the old model, while PC preserves the parameters of the old model. Together, these two operations maintain old knowledge while learning new tasks, without storing previous data. We conduct experiments on 9 scenarios of the GLUE (General Language Understanding Evaluation) benchmark. The experimental results show that our model achieves state-of-the-art performance in all scenarios.
Paper Type: long
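
The abstract does not give the exact formulation of LC and PC, so the following is only a minimal PyTorch sketch under stated assumptions: LC is approximated as a distillation-style KL term that keeps the new model's logits close to the old model's, and PC as an L2 penalty pulling the new parameters toward the old ones. The function name lpc_loss, the weights lambda_lc and lambda_pc, and the temperature are hypothetical illustrations, not taken from the paper.

    import torch
    import torch.nn.functional as F

    def lpc_loss(logits_new, logits_old, params_new, params_old,
                 labels, lambda_lc=1.0, lambda_pc=1.0, temperature=2.0):
        # Standard task loss on the new task's labels.
        task_loss = F.cross_entropy(logits_new, labels)

        # LC (assumed form): keep the new model's softened output
        # distribution close to the old model's, as in knowledge
        # distillation.
        lc_loss = F.kl_div(
            F.log_softmax(logits_new / temperature, dim=-1),
            F.softmax(logits_old / temperature, dim=-1),
            reduction="batchmean",
        ) * temperature ** 2

        # PC (assumed form): penalize drift of the new parameters away
        # from the old ones, preserving old knowledge without storing
        # any old data.
        pc_loss = sum(((p_new - p_old.detach()) ** 2).sum()
                      for p_new, p_old in zip(params_new, params_old))

        return task_loss + lambda_lc * lc_loss + lambda_pc * pc_loss

    # Usage with two hypothetical classifiers over a shared label space:
    old_model = torch.nn.Linear(16, 4)
    new_model = torch.nn.Linear(16, 4)
    x = torch.randn(8, 16)
    y = torch.randint(0, 4, (8,))
    with torch.no_grad():
        logits_old = old_model(x)
    loss = lpc_loss(new_model(x), logits_old,
                    new_model.parameters(), old_model.parameters(), y)
    loss.backward()

Under these assumptions, no replay buffer is needed: the only artifacts carried forward from the previous task are the old model's parameters and its logits on the current batch.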