A Joint Training Framework for Open-World Knowledge Graph Embeddings

Published: 31 Aug 2021 (AKBC 2021), Last Modified: 05 May 2023
Keywords: Knowledge Graph, Knowledge Graph Completion, Open-World Embeddings
TL;DR: An efficient framework for learning open-world knowledge graph embeddings.
Abstract: Knowledge graphs (KGs) represent factual information as graphs of entities connected by relations. Knowledge graph embeddings have emerged as a popular approach to encoding this information for downstream tasks such as natural language inference, question answering, and dialogue generation. As knowledge bases expand, we are presented with new (open-world) entities, often accompanied by textual descriptions, and we require techniques to embed these entities as they arrive using the textual information at hand. This task of open-world KG completion has received some attention in recent years. However, we find that existing approaches suffer from one or more of four drawbacks: 1) they are not modular with respect to the choice of KG embedding model; 2) they ignore best practices for aligning two embedding spaces; 3) they do not account for the differences in training strategy needed for datasets with different description sizes; and 4) they do not produce entity embeddings for use by downstream tasks. To address these problems, we propose FOlK (Framework for Open-World KG embeddings), a technique that jointly learns embeddings for KG entities from textual descriptions and KG structure for open-world knowledge graph completion. Additionally, we modify existing data sources to make available YAGO3-10-Open and WN18RR-Open, two datasets well suited to demonstrating the efficacy of open-world KG completion approaches. Finally, we empirically demonstrate that our model improves upon state-of-the-art baselines on several tasks, with gains of up to 72% in mean reciprocal rank.
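
The abstract does not spell out FOlK's training objective, so the following is only a minimal PyTorch sketch of the general idea it describes: jointly training structure-based entity embeddings with a description encoder, plus an alignment term so the two embedding spaces agree and an unseen (open-world) entity can be embedded from its description alone. The structural scorer shown (TransE-style), the mean-pooled text encoder, and all names (JointKGEModel, encode_text, etc.) are illustrative assumptions, not the paper's actual design.

import torch
import torch.nn as nn

class JointKGEModel(nn.Module):
    """Illustrative sketch (not FOlK itself): joint structural + textual
    entity embeddings with an alignment loss between the two spaces."""

    def __init__(self, n_entities, n_relations, vocab_size, dim=200):
        super().__init__()
        self.ent = nn.Embedding(n_entities, dim)   # structure-based entity embeddings
        self.rel = nn.Embedding(n_relations, dim)  # relation embeddings
        self.tok = nn.Embedding(vocab_size, dim)   # description token embeddings

    def encode_text(self, desc_tokens):
        # Simplest possible encoder: mean-pool the description's token
        # embeddings; desc_tokens has shape (batch, seq_len).
        return self.tok(desc_tokens).mean(dim=1)

    def score(self, h, r, t):
        # TransE-style plausibility (assumed structural model): higher is better.
        return -torch.norm(h + self.rel(r) - t, p=1, dim=-1)

    def loss(self, heads, rels, tails, neg_tails, head_desc):
        h, t = self.ent(heads), self.ent(tails)
        # 1) Structural margin loss on true triples vs. corrupted tails.
        pos = self.score(h, rels, t)
        neg = self.score(h, rels, self.ent(neg_tails))
        structural = torch.relu(1.0 + neg - pos).mean()
        # 2) Alignment loss: the description embedding of the head entity
        #    should land near its structural embedding.
        align = torch.norm(self.encode_text(head_desc) - h, p=2, dim=-1).mean()
        return structural + align

At test time, an open-world entity never seen during structural training would get its embedding from its description, e.g. e_new = model.encode_text(new_desc_tokens), which can then be scored against known entities or passed to downstream tasks.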
Subject Areas: Knowledge Representation, Semantic Web and Search
Archival Status: Archival