Simultaneous Extraction of Entities and Relations Based on Pre-trained Language Model and Pre-defined Language Templates

Abstract: With the advent of the big-data era, the need to extract core information from text is growing ever stronger, and knowledge graphs can represent that core information visually. Entity-relation extraction is a challenging and key step in knowledge graph construction. In this paper, we propose a model that performs entity-relation extraction effectively. The model adopts a joint training approach based on parameter sharing to avoid the error propagation that arises in pipelined extraction. It builds on a pre-trained language model to address the lack of semantic knowledge in traditional models, and it further uses a linguistic-template-based approach to bridge the discrepancy between the pre-training and fine-tuning of pre-trained language models. To validate the proposed approach, we conduct comparative experiments on the SemEval-2010 and CoNLL04 datasets. The results demonstrate that, compared to the baseline models, our model improves the precision, recall, and F1 score of joint entity-relation extraction.
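The template-based idea mentioned in the abstract can be illustrated with a minimal sketch: each candidate relation is verbalized as a cloze-style template appended to the sentence, and a pre-trained language model scores how natural each filled template sounds. The template strings, the `classify` helper, and the `toy_score` stand-in below are hypothetical illustrations, not the paper's actual implementation; a real system would score prompts with a masked-LM pseudo-log-likelihood.

```python
# Hypothetical sketch of template-based relation classification.
# Template wording and scoring are illustrative, not the paper's code.

RELATION_TEMPLATES = {
    "Cause-Effect": "{head} caused {tail}.",
    "Component-Whole": "{head} is a component of {tail}.",
    "Other": "{head} is unrelated to {tail}.",
}

def build_prompts(sentence, head, tail):
    """Append one natural-language template per candidate relation."""
    return {
        rel: f"{sentence} {tpl.format(head=head, tail=tail)}"
        for rel, tpl in RELATION_TEMPLATES.items()
    }

def classify(sentence, head, tail, score_fn):
    """Pick the relation whose filled template the language model scores highest."""
    prompts = build_prompts(sentence, head, tail)
    return max(prompts, key=lambda rel: score_fn(prompts[rel]))

# Toy stand-in for a pre-trained LM scorer (real use: MLM pseudo-log-likelihood).
def toy_score(prompt):
    return 1.0 if "caused" in prompt else 0.0

print(classify("The earthquake destroyed the bridge.",
               "earthquake", "destruction", toy_score))
```

Because the templates are ordinary sentences, scoring them reuses the language model's pre-training objective directly, which is the sense in which templates bridge the gap between pre-training and fine-tuning.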