Dynamic Prefix as Instructor for Incremental Named Entity Recognition: A Unified Seq2Seq Generation Framework
Abstract: The Incremental Named Entity Recognition (INER) task requires a model to be updated to extract entities from an expanding set of entity types without revisiting earlier training data, a constraint that typically arises from data privacy and data scarcity concerns. However, conventional incremental learning methods for INER often suffer from catastrophic forgetting, which degrades the model's performance on previously learned entity types. In this paper, we propose a parameter-efficient dynamic prefix method and formalize INER as a unified seq2seq generation task. By employing the dynamic prefix as a task instructor that guides the generative model, our approach preserves task-invariant knowledge while adapting to new entities with minimal parameter updates, making it particularly effective in low-resource scenarios. Additionally, we design a generative label augmentation strategy and a novel self-entropy loss to balance the stability and plasticity of the model. Empirical experiments on NER benchmarks demonstrate the effectiveness of our proposed method in addressing the challenges associated with INER.
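The core idea in the abstract, a small trainable prefix steering a frozen seq2seq backbone for each incremental task, can be sketched as follows. This is a hypothetical, simplified illustration (not the authors' implementation): `make_prefix`, `encode`, `PREFIX_LEN`, and `DIM` are invented names, and the "backbone" is a stub that only demonstrates where the prefix enters the computation.

```python
# Hypothetical sketch of a dynamic prefix as a task instructor.
# Assumption: per incremental task, only a short prefix of pseudo-token
# vectors is trainable; the seq2seq backbone is frozen and shared.
import random

DIM = 8          # embedding dimension (illustrative)
PREFIX_LEN = 3   # number of pseudo-tokens in each task prefix

def make_prefix(task_id, length=PREFIX_LEN, dim=DIM):
    """Task-specific prefix: the only new parameters added per task."""
    rng = random.Random(task_id)  # deterministic per task for this demo
    return [[rng.uniform(-0.1, 0.1) for _ in range(dim)] for _ in range(length)]

def encode(tokens, prefix):
    """Frozen 'backbone' stub: prepend the prefix to the token embeddings,
    so every downstream layer is conditioned on the task instructor."""
    embeddings = [[hash((t, d)) % 7 / 7.0 for d in range(DIM)] for t in tokens]
    return prefix + embeddings

# Two incremental tasks reuse the same frozen encode(); only prefixes differ,
# which is what keeps parameter updates minimal and limits forgetting.
seq = encode(["Barack", "Obama"], make_prefix(task_id=1))
print(len(seq))  # prints 5 (PREFIX_LEN + 2 tokens)
```

In a real system the prefix would be prepended to the keys and values of every attention layer (as in prefix-tuning) rather than to the input embeddings alone, but the parameter-isolation principle is the same.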
Paper Type: Long
Research Area: Efficient/Low-Resource Methods for NLP
Research Area Keywords: parameter-efficient-training; NLP in resource-constrained settings
Contribution Types: NLP engineering experiment, Approaches to low-resource settings
Languages Studied: English
Submission Number: 5432