Dynamically Constructing Relation Extraction Network for Continual Learning

ACL ARR 2025 February Submission 8383 Authors

16 Feb 2025 (modified: 09 May 2025) · CC BY 4.0
Abstract: Continual relation extraction aims to continuously learn new relation categories without forgetting those already learned. Achieving this goal requires addressing two key issues: catastrophic forgetting (CF) in the model and knowledge transfer (KT) between relations. CF has received a great deal of research attention; however, knowledge transfer, another important challenge of continual learning, has hardly been studied in the field of relation extraction. To address this, we propose Dynamically Constructing Relation Extraction Networks (DCREN) for continual relation extraction, which dynamically changes the architecture of the model through six designed actions to achieve knowledge transfer between similar relations. To further combat catastrophic forgetting, we propose an extensible classification module that expands the learning space for new tasks while preserving the knowledge of old relations. Experiments show that DCREN achieves state-of-the-art performance by dynamically updating the model structure to learn new relations and transfer old knowledge.
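The abstract describes an extensible classification module that grows with each new task while keeping old-relation knowledge intact. The paper's actual design is not given here, so the following is only a minimal NumPy sketch of the general pattern such a module follows in continual learning: the classifier's weight rows for previously learned relation classes are frozen, and new rows are appended for each new task. All class and method names (`ExpandableClassifier`, `expand`, `sgd_step`) are illustrative assumptions, not the authors' API.

```python
import numpy as np

class ExpandableClassifier:
    """Hypothetical sketch of an extensible classification head:
    rows for old relation classes are frozen when a new task arrives,
    so learning new relations cannot overwrite old knowledge."""

    def __init__(self, feature_dim):
        self.feature_dim = feature_dim
        self.weights = np.empty((0, feature_dim))  # one row per relation class
        self.frozen_rows = 0                       # rows belonging to old tasks

    def expand(self, num_new_classes, rng=None):
        """Freeze all existing rows, then append freshly initialised rows
        providing new learning space for the new task's relation classes."""
        rng = rng or np.random.default_rng(0)
        self.frozen_rows = self.weights.shape[0]
        new_rows = rng.normal(scale=0.01,
                              size=(num_new_classes, self.feature_dim))
        self.weights = np.vstack([self.weights, new_rows])

    def logits(self, features):
        # features: (batch, feature_dim) -> (batch, num_classes)
        return features @ self.weights.T

    def sgd_step(self, features, grad_logits, lr=0.1):
        """Apply a gradient step only to the unfrozen (new-task) rows."""
        grad_w = grad_logits.T @ features  # (num_classes, feature_dim)
        self.weights[self.frozen_rows:] -= lr * grad_w[self.frozen_rows:]
```

A usage pattern would be: call `expand` once at the start of each new task, train with `sgd_step`, and repeat; the frozen rows guarantee that predictions for old relations are unchanged by later tasks.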
Paper Type: Long
Research Area: Information Extraction
Research Area Keywords: relation extraction; continual learning
Contribution Types: Theory
Languages Studied: English
Submission Number: 8383