Enhancing Continual Relation Extraction with Concept Aware Dynamic Memory Optimization

Published: 01 Jan 2024, Last Modified: 13 Nov 2024 · APWeb/WAIM (1) 2024 · CC BY-SA 4.0
Abstract: Continual relation extraction (CRE) aims to assimilate constantly emerging new relations while avoiding forgetting previously learned ones. Existing works often rely on storing and replaying a fixed set of typical samples to prevent catastrophic forgetting. However, repeatedly replaying these samples can bias their latent feature representations. In this paper, we find that the representations of memory samples gradually lose representativeness and diversity over repeated replay, and that this representation bias seriously degrades the performance of the CRE model. To address this challenge, we propose a novel CRE framework based on dynamic memory. Specifically, we introduce a Large Language Model (LLM)-based concept-aware dynamic memory optimization scheme and an optimized relation-prototype construction to mitigate the effects of biased memory-sample representations: the former provides more appropriate training samples for replay, while the latter generates more accurate relation prototypes for prediction. Our experimental results demonstrate the effectiveness of our method in mitigating biased feature representations and overcoming catastrophic forgetting.
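The abstract refers to predicting relations via relation prototypes computed from memory samples. As a rough illustration of the general prototype-based prediction idea common in memory-replay CRE (not the paper's specific optimized-prototype method; all names, shapes, and toy data below are hypothetical), one could sketch:

```python
# Hypothetical sketch: prototype-based relation prediction from memory samples.
# This is NOT the paper's implementation; relation names, embedding sizes, and
# the plain-mean prototype are illustrative assumptions only.
import numpy as np


def relation_prototypes(memory: dict) -> dict:
    """Average each relation's stored sample embeddings into one prototype."""
    return {rel: embs.mean(axis=0) for rel, embs in memory.items()}


def predict(query: np.ndarray, prototypes: dict) -> str:
    """Assign the relation whose prototype is closest (cosine) to the query."""
    def cos(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))
    return max(prototypes, key=lambda rel: cos(query, prototypes[rel]))


# Toy memory: two relations, three stored 4-d sample embeddings each.
memory = {
    "founded_by": np.array([[1.0, 0.1, 0.0, 0.0],
                            [0.9, 0.2, 0.1, 0.0],
                            [1.1, 0.0, 0.0, 0.1]]),
    "located_in": np.array([[0.0, 0.1, 1.0, 0.9],
                            [0.1, 0.0, 0.9, 1.1],
                            [0.0, 0.2, 1.0, 1.0]]),
}
protos = relation_prototypes(memory)
print(predict(np.array([0.95, 0.1, 0.05, 0.0]), protos))  # closest to "founded_by"
```

The paper's contribution is precisely that such prototypes degrade when the stored embeddings drift under repeated replay, motivating its LLM-based dynamic memory optimization and refined prototype construction.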
