Enhancing Zero-Shot Relation Triplet Extraction through Staged Interaction with Large Language Models

ACL ARR 2024 June Submission459 Authors

11 Jun 2024 (modified: 12 Aug 2024) · ACL ARR 2024 June Submission · CC BY 4.0
Abstract: Zero-shot Relation Triplet Extraction (ZeroRTE) is a challenging yet valuable task that extracts relation triplets from unstructured text for new relation types, significantly reducing the time and effort needed for data labeling. As the zero-shot capability of large language models (LLMs) has improved, many zero-shot tasks can now be performed more effectively simply by chatting with an LLM. In this work, we reformulate zero-shot triplet extraction as a two-stage chat with an LLM. Specifically, in the first stage we prompt the LLM to perform Named Entity Recognition (NER); in the second stage, we prompt the LLM to perform relation classification, combining the results of the first stage. To mitigate the impact of redundant information in the LLM's output on task evaluation, we design a Post-Processing module that extracts the final relation triplets. Experiments on the Wiki-ZSL and FewRel datasets show the efficacy of our approach for the ZeroRTE task. Remarkably, our method outperforms strong baselines by a significant margin, achieving an impressive 15.89% increase in F1 score, particularly on Wiki-ZSL with 15 unseen relations.
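The abstract outlines a pipeline of a stage-1 NER prompt, a stage-2 relation-classification prompt, and a Post-Processing module over the LLM reply. The sketch below illustrates how such a pipeline could be wired together; the `chat` helper, the prompt wording, and the triplet-parsing regex are illustrative assumptions, not the authors' implementation.

```python
# A minimal sketch of the two-stage prompting pipeline described above,
# assuming a generic `chat(prompt) -> str` helper (hypothetical; substitute
# an actual chat-completion client).
import re
from typing import List, Tuple


def chat(prompt: str) -> str:
    """Hypothetical LLM call; replace with a real chat-completion client."""
    raise NotImplementedError


def stage1_ner(sentence: str) -> List[str]:
    """Stage 1: prompt the LLM to list candidate entities in the sentence."""
    prompt = (
        "List the named entities in the following sentence, one per line, "
        "with no extra commentary.\n"
        f"Sentence: {sentence}"
    )
    return [line.strip() for line in chat(prompt).splitlines() if line.strip()]


def stage2_relation(sentence: str, entities: List[str],
                    candidate_relations: List[str]) -> str:
    """Stage 2: prompt the LLM to classify the relation between the entities."""
    prompt = (
        f"Sentence: {sentence}\n"
        f"Entities: {', '.join(entities)}\n"
        f"Candidate relations: {', '.join(candidate_relations)}\n"
        "Answer in the form (head entity, relation, tail entity)."
    )
    return chat(prompt)


def post_process(reply: str) -> List[Tuple[str, str, str]]:
    """Post-Processing: keep only well-formed (head, relation, tail) triplets."""
    pattern = re.compile(r"\(([^,()]+),\s*([^,()]+),\s*([^,()]+)\)")
    return [tuple(p.strip() for p in m.groups()) for m in pattern.finditer(reply)]


# The Post-Processing step applied to a verbose LLM reply:
print(post_process("Sure! The triplet is (Barack Obama, place of birth, Honolulu)."))
# -> [('Barack Obama', 'place of birth', 'Honolulu')]
```

The regex-based Post-Processing is only one way to discard the conversational filler an LLM tends to add; the paper's module may use a different parsing strategy.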
Paper Type: Long
Research Area: Information Extraction
Research Area Keywords: Zero-shot extraction, prompting
Contribution Types: Approaches to low-resource settings
Languages Studied: English
Submission Number: 459