Keywords: Continual learning, prompts, dynamic management
Abstract: Prompt-based continual learning methods have emerged to address catastrophic forgetting by leveraging large-scale foundation models. These methods keep pretrained models frozen and tune only small sets of parameters called prompts to learn tasks sequentially. However, when a new task arrives, the key-query matching mechanism in prompt-based methods selects the most relevant prompt without adequately considering whether it is actually suitable for learning the task. To address this, we propose CoEn (Continual Enhanced prompt pool), which dynamically manages the prompt pool each time a new task is introduced. Our goal is to transform the static management of the prompt pool into a dynamic approach, enabling greater flexibility in adapting to new tasks and reducing the risk of catastrophic forgetting. Specifically, CoEn includes a new self-enhancement mechanism that assesses whether the prompts in the prompt pool can positively transfer knowledge to a new task and selectively strengthens them. We evaluate the proposed method on class-incremental image classification benchmarks. Experimental results show that the proposed method outperforms existing prompt-based methods by an average margin of 3.8% across all scenarios.
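For readers unfamiliar with the key-query matching the abstract critiques, the sketch below illustrates how prompt selection typically works in this family of methods (in the style of L2P): a frozen encoder produces a query feature, and prompts whose learnable keys are most similar to the query are selected. All names and shapes here are illustrative assumptions, not taken from the paper.

```python
import torch
import torch.nn.functional as F

def select_prompts(query: torch.Tensor, prompt_keys: torch.Tensor,
                   prompt_pool: torch.Tensor, top_k: int = 5):
    """Select the top-k most relevant prompts for an input query.

    query:       (d,) feature of the input from the frozen pretrained encoder
    prompt_keys: (M, d) learnable keys, one per prompt in the pool
    prompt_pool: (M, L, d) learnable prompt token sequences
    """
    # Cosine similarity between the query and every prompt key.
    sims = F.cosine_similarity(query.unsqueeze(0), prompt_keys, dim=-1)  # (M,)
    # Pick the k best-matching prompts. Note this ranks purely by relevance;
    # it does not check whether the chosen prompts are actually suitable for
    # learning the new task, which is the gap CoEn targets.
    top_sims, idx = sims.topk(top_k)
    return prompt_pool[idx], idx, top_sims

# Toy usage with random tensors standing in for real features and prompts.
query = torch.randn(768)
keys = torch.randn(10, 768)
pool = torch.randn(10, 5, 768)
prompts, idx, sims = select_prompts(query, keys, pool)
print(idx, sims)
```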
Submission Number: 27