TL;DR: A novel method for modeling knowledge graphs based on distance embeddings and neural networks
Abstract: Over the past decade, knowledge graphs have become popular for capturing structured domain knowledge.
Relational learning models enable the prediction of missing links inside knowledge graphs. More specifically, latent distance approaches model the relationships among entities via a distance between latent representations.
Translating embedding models (e.g., TransE) are among the most popular latent distance approaches; they use a single distance function to learn multiple relation patterns.
However, they are largely unable to capture symmetric relations, since the norm of the representation vector of every symmetric relation is forced to zero. They also lose information when learning relations with reflexive patterns, since reflexive relations are then also encoded as symmetric and transitive.
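For concreteness, here is a worked sketch of the symmetry argument (assuming TransE's score $f(h, r, t) = \|h + r - t\|$, with $h$, $r$, $t$ denoting the head, relation, and tail vectors):

```latex
\begin{align*}
  h + r &\approx t && \text{the triple $(h, r, t)$ holds}\\
  t + r &\approx h && \text{symmetry: $(t, r, h)$ also holds}\\
  2r    &\approx 0 && \text{summing the two constraints}
\end{align*}
% Hence \|r\| \approx 0 and, substituting back, h \approx t:
% entities linked by a symmetric relation are pushed onto each other.
```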
We propose the Multiple Distance Embedding model (MDE), which addresses these limitations, together with a framework that enables collaborative combinations of latent distance-based terms.
Our solution is based on two principles: 1) using a limit-based loss instead of a margin ranking loss, and 2) learning independent embedding vectors for each term, which allows training and predicting collectively with contradicting distance terms.
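To illustrate the first principle, a minimal PyTorch-style sketch contrasting the two losses (the threshold values gamma, gamma1, gamma2, and beta below are hypothetical hyperparameters, not the paper's settings):

```python
import torch

def margin_ranking_loss(pos_score, neg_score, gamma=1.0):
    # Classic margin ranking loss: only the *gap* between positive and
    # negative scores matters, so absolute score values are unconstrained.
    return torch.relu(gamma + pos_score - neg_score).mean()

def limit_based_loss(pos_score, neg_score, gamma1=1.0, gamma2=2.0, beta=1.0):
    # Limit-based loss: pushes positive scores below gamma1 and negative
    # scores above gamma2, bounding each side's score range separately.
    pos_term = torch.relu(pos_score - gamma1)
    neg_term = torch.relu(gamma2 - neg_score)
    return (pos_term + beta * neg_term).mean()
```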
We further demonstrate that MDE allows modeling relations with (anti)symmetry, inversion, and composition patterns. We also formulate MDE as a neural network model, which allows mapping non-linear relations between the embedding vectors and the expected output of the score function.
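To illustrate the second principle, a minimal sketch of a multi-term distance model (the three term forms and the learned weighting are assumptions for illustration, not the paper's exact formulation):

```python
import torch
import torch.nn as nn

class MultiDistanceSketch(nn.Module):
    # Sketch: three translation-style distance terms, each with its own
    # independent embedding tables, combined into a single score.
    N_TERMS = 3  # this sketch fixes three example terms

    def __init__(self, n_entities, n_relations, dim):
        super().__init__()
        self.ent = nn.ModuleList(
            [nn.Embedding(n_entities, dim) for _ in range(self.N_TERMS)])
        self.rel = nn.ModuleList(
            [nn.Embedding(n_relations, dim) for _ in range(self.N_TERMS)])
        # Learned weights aggregating the terms into one score.
        self.w = nn.Parameter(torch.full((self.N_TERMS,), 1.0 / self.N_TERMS))

    def forward(self, h, r, t):
        # Each term sees its own, independently trained vectors, so the
        # terms may encode mutually contradicting geometric constraints.
        terms = torch.stack([
            torch.norm(self.ent[0](h) + self.rel[0](r) - self.ent[0](t), dim=-1),
            torch.norm(self.ent[1](t) + self.rel[1](r) - self.ent[1](h), dim=-1),
            torch.norm(self.ent[2](h) + self.ent[2](t) - self.rel[2](r), dim=-1),
        ])
        return (self.w.unsqueeze(-1) * terms).sum(dim=0)

# Hypothetical usage: scores = MultiDistanceSketch(1000, 20, 64)(h_idx, r_idx, t_idx)
```

A non-linear combination of such terms (e.g., via a small feed-forward layer over the stacked term values) would correspond to the neural network formulation the abstract mentions.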
Our empirical results show that MDE outperforms the state-of-the-art embedding models on several benchmark datasets.
Code: https://drive.google.com/open?id=1eE5KvWtg6IJDlBKW-D7vR7lURCQNLich
Keywords: Representation Learning, Knowledge Graph embedding, Neural Networks