RGembed: A Knowledge Graph Embedding Model Integrating Dual-Prediction and Graph Attention Networks

Published: 2024, Last Modified: 05 Jun 2025, AIoTC 2024, CC BY-SA 4.0
Abstract: Knowledge graph embeddings play a vital role in downstream tasks such as link prediction, entity alignment, and question answering. However, current techniques frequently struggle to capture intricate semantic relationships. We introduce RGembed, a model designed to improve embedding representations and overall model performance. First, it reverses the training triples, enabling bidirectional prediction between head and tail entities in a knowledge graph; both directions are trained under a shared loss function, allowing the model to capture a wider range of semantic correlations during training. Second, the model incorporates a Graph Attention Network (GAT) layer at the entity embedding stage, which enriches entity representations by aggregating information from neighboring entities. An extensive series of comparative experiments verifies the efficacy of RGembed: its mean reciprocal rank (MRR) improved by 7.1%, 7.3%, 1.9%, 3.4%, and 6.4% over the respective baseline models. Ablation experiments further show that removing the bidirectional prediction or the GAT layer causes a notable drop in embedding quality, confirming the importance of these components to the model's performance.
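The two mechanisms the abstract describes can be sketched in a few lines. The snippet below is an illustrative approximation only, not the paper's actual implementation: `add_inverse_triples` shows one common way to realize bidirectional head/tail prediction by augmenting the training set with reversed triples under a distinct inverse-relation id, and `gat_aggregate` shows a single-head, NumPy-only GAT-style neighbor aggregation (all function names, the neighbor-dictionary format, and the LeakyReLU slope are assumptions).

```python
import numpy as np

def add_inverse_triples(triples, num_relations):
    """Hypothetical augmentation: for each (head, r, tail) add (tail, r', head),
    where r' = r + num_relations denotes the inverse relation. Training both
    directions under one shared loss approximates bidirectional prediction."""
    inverse = [(t, r + num_relations, h) for (h, r, t) in triples]
    return triples + inverse

def leaky_relu(x, slope=0.2):
    return np.where(x > 0, x, slope * x)

def gat_aggregate(embeddings, neighbors, W, a):
    """Single-head GAT-style layer: each entity's new embedding is an
    attention-weighted sum of its projected neighbor embeddings.
    embeddings: (n, d) entity matrix; neighbors: {entity: [neighbor ids]};
    W: (d_out, d) projection; a: (2 * d_out,) attention vector."""
    h = embeddings @ W.T                       # project all entities
    out = np.zeros_like(h)
    for i, nbrs in neighbors.items():
        idx = [i] + list(nbrs)                 # include a self-loop
        # attention logits e_ij = LeakyReLU(a . [h_i || h_j])
        logits = np.array([leaky_relu(a @ np.concatenate([h[i], h[j]]))
                           for j in idx])
        alpha = np.exp(logits - logits.max())
        alpha /= alpha.sum()                   # softmax over the neighborhood
        out[i] = sum(w * h[j] for w, j in zip(alpha, idx))
    return out
```

With an empty neighbor list, the self-loop makes `gat_aggregate` reduce to a plain linear projection of that entity, which is one simple way to keep isolated entities well-defined.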