Attending to SPARQL Logs for Knowledge Representation Learning

Published: 01 Jan 2022, Last Modified: 14 May 2023. KSEM (1) 2022.
Abstract: Knowledge Representation Learning (KRL) maps entities and relations to a continuous low-dimensional vector space, alleviating the data sparsity of Knowledge Graphs (KGs) and improving computational efficiency. However, most prior KRL models, represented by TransE, embed all triple facts equally, without distinguishing the latent semantic information of relations. In this paper, (1) we propose a novel correlation-aware Knowledge Representation Learning framework, which integrates semantic features of relations during embedding; (2) we capture these semantics by mining time-sequence and frequency characteristics of relations from historical SPARQL logs; (3) we design a weighted encoder that distinguishes the correlation of each triple according to relational semantics, and introduce this triple correlation into translation-based models to enhance entity representations. Experimental results on Wikidata datasets show that our proposed model significantly outperforms state-of-the-art translation-based models on both knowledge graph completion and triple classification tasks.
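The abstract only outlines the approach, so the following Python sketch is a hypothetical illustration of the two ingredients it names: mining frequency and time-sequence (recency) characteristics of relations from a SPARQL log, and using the resulting per-relation weight to scale a TransE-style score. The toy log format, the relation_weight heuristic, and the way the weight enters the score are all assumptions for illustration, not the paper's actual weighted encoder.

```python
# Minimal sketch of the idea described in the abstract, NOT the authors' code.
# Assumptions: relation "semantics" are approximated by per-relation query
# frequency and recency mined from a toy SPARQL log, and the resulting weight
# simply scales a TransE-style translation score.

from collections import Counter
import math
import numpy as np

# --- 1. Mine frequency / time-sequence features from SPARQL logs (toy data) ---
# Each entry: (timestamp, relation appearing in a query's triple patterns).
sparql_log = [
    (1, "P31"), (2, "P31"), (2, "P106"), (3, "P31"), (5, "P106"), (8, "P800"),
]

freq = Counter(rel for _, rel in sparql_log)      # how often a relation is queried
last_seen = {rel: t for t, rel in sparql_log}     # simple recency signal
t_max = max(t for t, _ in sparql_log)

def relation_weight(rel, decay=0.1):
    """Hypothetical correlation weight: log-scaled frequency damped by recency."""
    f = freq.get(rel, 0)
    age = t_max - last_seen.get(rel, t_max)
    return math.log1p(f) * math.exp(-decay * age)

# --- 2. Plug the weight into a TransE-style score (h + r should be close to t) ---
rng = np.random.default_rng(0)
dim = 16
entities = {e: rng.normal(size=dim) for e in ["Q42", "Q5", "Q937"]}
relations = {r: rng.normal(size=dim) for r in ["P31", "P106", "P800"]}

def weighted_transe_score(h, r, t):
    """Lower is better; the mined weight emphasises frequently queried relations."""
    base = np.linalg.norm(entities[h] + relations[r] - entities[t], ord=1)
    return relation_weight(r) * base

print(weighted_transe_score("Q42", "P31", "Q5"))
```

In a full model, such weights would modulate the margin-based training loss of a translation-based embedding rather than just the scoring call shown here.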