Injecting Background Knowledge into Embedding Models for Predictive Tasks on Knowledge Graphs

Published: 23 Feb 2021, Last Modified: 05 May 2023. Venue: ESWC 2021 Research
Keywords: Knowledge graphs, Embedding Models, Representation learning, Background Knowledge, Link prediction, Triple classification
Abstract: Embedding models have been successfully exploited for Knowledge Graph refinement. In these models, the data graph is projected into a low-dimensional space in which the graph's structural information is preserved as much as possible, enabling an efficient computation of solutions. We propose a solution for injecting available background knowledge (schema axioms) to further improve the quality of the embeddings. The method has been applied to enhance existing models, producing embeddings that encode not only knowledge that is directly observed but also knowledge derived by reasoning on the available axioms. An experimental evaluation on link prediction and triple classification tasks shows that the proposed method improves over the original models.
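One common way to make schema axioms available to an embedding model is to materialize the triples they entail and add them to the training set. The sketch below illustrates this idea for rdfs:subClassOf reasoning only; the axiom representation, function names, and example entities are illustrative assumptions, not the paper's actual method or data.

```python
# Hypothetical sketch: derive extra type assertions from subclass axioms,
# so a downstream embedding model (e.g. TransE-style) can train on triples
# that are entailed by the schema rather than merely observed.

def subclass_closure(axioms):
    """Transitive closure of (subclass, superclass) axiom pairs."""
    closure = set(axioms)
    changed = True
    while changed:
        changed = False
        for (a, b) in list(closure):
            for (c, d) in list(closure):
                if b == c and (a, d) not in closure:
                    closure.add((a, d))
                    changed = True
    return closure

def derive_type_triples(triples, axioms):
    """Augment (entity, 'type', class) triples with entailed class memberships."""
    closure = subclass_closure(axioms)
    derived = set(triples)
    for (s, p, o) in triples:
        if p == "type":
            for (sub, sup) in closure:
                if o == sub:
                    derived.add((s, "type", sup))
    return derived

# Toy example (hypothetical data): Student ⊑ Person ⊑ Agent
triples = {("alice", "type", "Student")}
axioms = {("Student", "Person"), ("Person", "Agent")}
augmented = derive_type_triples(triples, axioms)
# augmented also contains (alice, type, Person) and (alice, type, Agent)
```

The augmented triple set would then replace the original one as input to any off-the-shelf embedding model, which is one plausible reading of how background knowledge can be "injected" without changing the model's loss function.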
Subtrack: Machine Learning
First Author Is Student: No