RAPL: A Relation-Aware Prototype Learning Approach for Few-Shot Document-Level Relation Extraction

Published: 07 Oct 2023, Last Modified: 01 Dec 2023
Venue: EMNLP 2023 Main
Submission Type: Regular Long Paper
Submission Track: Information Extraction
Submission Track 2: Machine Learning for NLP
Keywords: Document-Level Relation Extraction, Few-Shot Learning, Metric-Based Meta-Learning, Relation-Aware Prototype Learning
TL;DR: We propose a relation-aware prototype learning method that strengthens the relational semantics of prototype representations for few-shot document-level relation extraction, outperforming state-of-the-art approaches by a large margin.
Abstract: How can we identify semantic relations among entities in a document when only a few labeled documents are available? Few-shot document-level relation extraction (FSDLRE) is crucial for addressing the pervasive data scarcity problem in real-world scenarios. Metric-based meta-learning, which constructs class prototypes for classification, is an effective framework widely adopted for FSDLRE. However, existing works often struggle to obtain class prototypes with accurate relational semantics: 1) To build the prototype for a target relation type, they aggregate the representations of all entity pairs holding that relation, even though these entity pairs may also hold other relations, which disturbs the prototype. 2) They use a set of generic NOTA (none-of-the-above) prototypes across all tasks, neglecting that the NOTA semantics differ across tasks with different target relation types. In this paper, we propose a relation-aware prototype learning method for FSDLRE that strengthens the relational semantics of prototype representations. By judiciously leveraging relation descriptions and realistic NOTA instances as guidance, our method effectively refines the relation prototypes and generates task-specific NOTA prototypes. Extensive experiments demonstrate that our method outperforms state-of-the-art approaches by an average of 2.61\% $F_1$ across various settings of two FSDLRE benchmarks.
Submission Number: 3306
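To make the two ideas in the abstract more concrete, below is a minimal PyTorch sketch of (1) refining a relation prototype using its relation description and (2) deriving task-specific NOTA prototypes from realistic NOTA instances in the support documents. The function names, the similarity-weighted aggregation, and the 50/50 fusion are illustrative assumptions for exposition, not the paper's actual implementation.

```python
import torch
import torch.nn.functional as F

def build_relation_prototypes(pair_reps, pair_labels, rel_desc_reps):
    """Relation-aware prototype construction for one episode (illustrative).

    pair_reps:     (num_pairs, dim)      support entity-pair embeddings
    pair_labels:   (num_pairs, num_rels) multi-hot labels; a pair may hold
                                         several relations in document-level RE
    rel_desc_reps: (num_rels, dim)       encoded relation descriptions
    """
    protos = []
    for r in range(rel_desc_reps.size(0)):
        members = pair_reps[pair_labels[:, r].bool()]   # pairs holding relation r
        # Weight each member by its similarity to the relation description,
        # down-weighting evidence tied to the pair's other relations.
        weights = F.softmax(members @ rel_desc_reps[r], dim=0)
        protos.append((weights.unsqueeze(-1) * members).sum(dim=0))
    return torch.stack(protos)                          # (num_rels, dim)

def build_nota_prototypes(nota_pair_reps, base_nota):
    """Fuse generic (learnable) NOTA prototypes with realistic NOTA entity
    pairs from the support documents into task-specific NOTA prototypes.
    """
    assign = (nota_pair_reps @ base_nota.t()).argmax(dim=-1)  # nearest base proto
    fused = []
    for k in range(base_nota.size(0)):
        members = nota_pair_reps[assign == k]
        fused.append(base_nota[k] if members.numel() == 0
                     else 0.5 * base_nota[k] + 0.5 * members.mean(dim=0))
    return torch.stack(fused)                           # (num_nota_protos, dim)

# Toy usage with random tensors standing in for encoder outputs.
if __name__ == "__main__":
    dim, num_rels = 64, 3
    rel_protos = build_relation_prototypes(
        torch.randn(10, dim),
        (torch.rand(10, num_rels) > 0.5).float(),
        torch.randn(num_rels, dim),
    )
    nota_protos = build_nota_prototypes(torch.randn(20, dim), torch.randn(3, dim))
    print(rel_protos.shape, nota_protos.shape)  # (3, 64) and (3, 64)
```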