Pre-Trained Embeddings for Enhancing Multi-Hop Reasoning

Published: 16 Jun 2023, Last Modified: 16 Jun 2023
IJCAI 2023 Workshop KBCG Oral
Readers: Everyone
Keywords: Link Prediction, XAI, Knowledge Graphs
Abstract: Knowledge graphs are an efficient way to represent heterogeneous data from multiple sources or disciplines by means of nodes and their relations. Nevertheless, they are frequently incomplete with respect to the subject they represent. Link prediction methods are used to discover additional links (or even to create new ones) between entities present in the Knowledge Graph (KG). For this purpose, multi-hop reasoning models have demonstrated good predictive performance and the ability to generate interpretable decisions, enabling their application in high-stakes domains such as finance and public health. A multi-hop reasoning model usually has two tasks: 1) construct an accurate representation of the entities and relations of the KG; 2) use these representations to explore the reasoning paths in the KG that support the newly predicted links. In this paper, we investigate how the performance of a multi-hop reasoning model changes when pre-trained embeddings are used for the KG's nodes and relations. Experiments conducted on three benchmark datasets, namely WN18RR, NELL-995 and FB15K-237, suggest that using pre-trained embeddings improves: (i) the predictive performance of multi-hop reasoning models on all three datasets, (ii) the number of newly predicted links, and (iii) the quality of the paths used as explanations.
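To make the idea concrete, the following is a minimal, hypothetical sketch (not the paper's actual implementation) of how a multi-hop reasoning model's entity and relation embedding tables could be initialized from pre-trained KG embeddings instead of being learned from scratch. The class name, constructor arguments, and file names are illustrative assumptions.

```python
import torch
import torch.nn as nn

# Illustrative sketch: embedding tables for a multi-hop reasoning agent.
# They can be initialized randomly (trained from scratch) or from
# pre-trained KG embeddings (e.g., exported from a separately trained
# embedding model). Names here are hypothetical, not from the paper.
class ReasoningAgentEmbeddings(nn.Module):
    def __init__(self, num_entities, num_relations, dim,
                 pretrained_entities=None, pretrained_relations=None,
                 freeze=False):
        super().__init__()
        if pretrained_entities is not None:
            # Copy pre-trained entity vectors instead of random init.
            self.entities = nn.Embedding.from_pretrained(
                pretrained_entities, freeze=freeze)
        else:
            self.entities = nn.Embedding(num_entities, dim)
        if pretrained_relations is not None:
            # Copy pre-trained relation vectors instead of random init.
            self.relations = nn.Embedding.from_pretrained(
                pretrained_relations, freeze=freeze)
        else:
            self.relations = nn.Embedding(num_relations, dim)

    def forward(self, entity_ids, relation_ids):
        # Concatenated (entity, relation) features would feed the
        # agent's policy network during path exploration.
        return torch.cat([self.entities(entity_ids),
                          self.relations(relation_ids)], dim=-1)

# Hypothetical usage: load vectors produced by a pre-trained KG embedding
# model (file names are placeholders).
# pretrained_e = torch.load("pretrained_entity_embeddings.pt")
# pretrained_r = torch.load("pretrained_relation_embeddings.pt")
# emb = ReasoningAgentEmbeddings(num_entities=pretrained_e.shape[0],
#                                num_relations=pretrained_r.shape[0],
#                                dim=pretrained_e.shape[1],
#                                pretrained_entities=pretrained_e,
#                                pretrained_relations=pretrained_r)
```

Whether the pre-trained vectors are frozen or fine-tuned during path exploration is a design choice; the `freeze` flag above simply exposes both options in this sketch.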
