A Transformer-based Knowledge Graph Embedding Model Combining Graph Paths and Local Neighborhood

Published: 01 Jan 2024, Last Modified: 21 Jul 2025 · IJCNN 2024 · CC BY-SA 4.0
Abstract: Many existing knowledge graph embedding methods achieve outstanding performance by exploiting the graph structure, among which graph neural network-based methods that utilize the local neighborhood are the most representative. However, the shallow architecture of graph neural networks limits their expressiveness, and over-smoothing prevents them from capturing long-distance information. To address these issues, we propose a Transformer-based knowledge graph embedding method that combines graph paths and the local neighborhood (TKGE-PN). First, it samples multiple graph paths using a biased random walk algorithm starting from the central entity. These sampled paths are then transformed into vector representations by a Transformer-based graph path encoding module. Finally, the local neighborhood encoding module aggregates all graph path representations to score triples. During graph path encoding, a masked entity-relation prediction task is used to enhance the model's ability to learn long-distance information. Experimental results on two standard datasets, FB15k-237 and WN18RR, show that TKGE-PN surpasses most existing models, demonstrating the effectiveness of our approach.
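The path-sampling step can be illustrated with a short sketch. This is an assumption-laden illustration, not the authors' implementation: the triple format, the single `return_bias` parameter that down-weights immediate backtracking, and all function names are invented here for clarity.

```python
import random

# Hedged sketch of biased random walks over a knowledge graph, as in the
# abstract's first step. The bias scheme (penalizing immediate backtracking)
# and all parameter names are assumptions, not the paper's actual algorithm.
def biased_random_walks(triples, start, num_walks=4, walk_len=3,
                        return_bias=0.5, seed=0):
    """Sample num_walks alternating entity/relation paths of walk_len hops
    starting from the central entity `start`."""
    rng = random.Random(seed)
    # Build adjacency: entity -> list of (relation, neighbor),
    # adding inverse edges so walks can traverse triples in both directions.
    adj = {}
    for h, r, t in triples:
        adj.setdefault(h, []).append((r, t))
        adj.setdefault(t, []).append((r + "^-1", h))
    walks = []
    for _ in range(num_walks):
        path, cur, prev = [start], start, None
        for _ in range(walk_len):
            nbrs = adj.get(cur, [])
            if not nbrs:
                break
            # Bias: reduce the probability of stepping straight back.
            weights = [return_bias if t == prev else 1.0 for _, t in nbrs]
            rel, nxt = rng.choices(nbrs, weights=weights, k=1)[0]
            path += [rel, nxt]
            prev, cur = cur, nxt
        walks.append(path)
    return walks

triples = [("a", "r1", "b"), ("b", "r2", "c"), ("a", "r3", "c")]
paths = biased_random_walks(triples, "a")
# Each path is an alternating [entity, relation, entity, ...] sequence
# rooted at "a", ready to be fed to a sequence encoder such as a Transformer.
```

Each sampled path is a token sequence of entities and relations, which is what makes a Transformer a natural encoder for this step.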