Abstract: Relation extraction is an important task in natural language processing that aims to extract semantic relationships between entities from unstructured text. Traditional relation extraction methods do not sufficiently capture semantic, contextual, and deep representations. In this paper, we propose a relation extraction model (RoBBS: RoBERTa + Bi-GRU + Self-Attention) based on pre-training and bidirectional semantic union. First, the RoBERTa pre-trained model is used to extract the contextual features of sentences. Then, Bi-GRU is leveraged to realize bidirectional semantic union, comprehensively extracting bidirectional semantic information. Combining this network with a self-attention mechanism assigns greater weights to the semantic information that plays a more significant role in determining a sentence's relation class, which in turn makes feature selection more efficient. Finally, the relation is classified with the Softmax function. Experimental results show that the model achieves an F-score of 89.02% on the SemEval-2010 Task 8 dataset, outperforming state-of-the-art models.
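The pipeline described in the abstract (contextual embeddings → Bi-GRU → self-attention pooling → Softmax classifier) can be sketched as follows. This is a minimal toy illustration in NumPy, not the authors' implementation: the random token embeddings stand in for RoBERTa outputs, and all weight matrices, dimensions, and the single-vector attention scoring are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h, W, U, b):
    # One GRU cell step. W: (3h, d), U: (3h, h), b: (3h,),
    # packed as [update gate; reset gate; candidate state].
    hd = h.shape[0]
    z = sigmoid(W[:hd] @ x + U[:hd] @ h + b[:hd])                  # update gate
    r = sigmoid(W[hd:2*hd] @ x + U[hd:2*hd] @ h + b[hd:2*hd])      # reset gate
    n = np.tanh(W[2*hd:] @ x + U[2*hd:] @ (r * h) + b[2*hd:])      # candidate
    return (1 - z) * h + z * n

def bigru(X, params_f, params_b):
    # Run a GRU forward and backward over the sequence and
    # concatenate both directions at each step ("semantic union").
    T = X.shape[0]
    hd = params_f[1].shape[1]
    hf, hb = np.zeros(hd), np.zeros(hd)
    fwd, bwd = [], [None] * T
    for t in range(T):
        hf = gru_step(X[t], hf, *params_f)
        fwd.append(hf)
    for t in reversed(range(T)):
        hb = gru_step(X[t], hb, *params_b)
        bwd[t] = hb
    return np.stack([np.concatenate([f, b]) for f, b in zip(fwd, bwd)])

def self_attention_pool(H, w):
    # Score each position, normalize to attention weights, and
    # return the weighted sum of hidden states.
    scores = H @ w
    a = np.exp(scores - scores.max())
    a /= a.sum()
    return a @ H, a

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Toy sizes; SemEval-2010 Task 8 distinguishes 19 relation classes.
d, hd, T, n_classes = 8, 6, 5, 19
X = rng.standard_normal((T, d))  # stand-in for RoBERTa token embeddings

make_params = lambda: (rng.standard_normal((3 * hd, d)) * 0.1,
                       rng.standard_normal((3 * hd, hd)) * 0.1,
                       np.zeros(3 * hd))

H = bigru(X, make_params(), make_params())          # (T, 2*hd)
sent, attn = self_attention_pool(H, rng.standard_normal(2 * hd))
logits = rng.standard_normal((n_classes, 2 * hd)) @ sent
probs = softmax(logits)                             # relation-class distribution
```

With trained (rather than random) parameters, `attn` would concentrate on the tokens most indicative of the relation, and `probs.argmax()` would give the predicted class.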
External IDs: dblp:journals/mlc/HeYHKR25