Contrastive Learning with Graph Context Modeling for Sparse Knowledge Graph Completion

Anonymous

05 Jun 2022 (modified: 05 May 2023) · ACL ARR 2022 June Blind Submission · Readers: Everyone
Keywords: Knowledge Graph, Contrastive Learning, Graph Neural Network
Abstract: Knowledge Graph Embeddings (KGE) aim to map entities and relations to high-dimensional spaces and have become the de facto standard for knowledge graph completion. Most existing KGE methods suffer from the sparsity challenge: entities that appear infrequently in a knowledge graph are harder to predict. In this work, we propose KRACL, a novel framework that alleviates the widespread sparsity in KGs with graph context and contrastive learning. First, we propose the Knowledge Relational Attention Network (KRAT), which leverages graph context by jointly aggregating neighbors and relations with an attention mechanism. KRAT captures the subtle importance of different context triples and exploits multi-hop information in knowledge graphs. Second, we propose the knowledge contrastive loss, which combines a contrastive loss with cross-entropy loss; this introduces more negative samples and thus enriches the feedback signal for sparse entities. Our experiments demonstrate that KRACL achieves superior results across various standard knowledge graph benchmarks, especially on WN18RR and NELL-995, which contain many low in-degree entities. Extensive experiments also bear out KRACL's effectiveness in handling sparse knowledge graphs and its robustness against noisy triples.
Paper Type: long
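
The abstract describes KRAT only at a high level: attention-weighted aggregation over the (relation, neighbor) pairs surrounding an entity. The PyTorch sketch below is one plausible reading of that description, not the authors' formulation; the class name `KRATLayer`, the concatenation-based message and scoring functions, and the tensor shapes are all illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class KRATLayer(nn.Module):
    """Hypothetical sketch of attention-based graph-context aggregation:
    each (relation, neighbor) pair around an entity produces a message,
    an attention score weights the messages, and the weighted sum becomes
    the entity's updated representation. Stacking such layers would expose
    multi-hop information, as the abstract suggests."""

    def __init__(self, dim):
        super().__init__()
        self.msg = nn.Linear(2 * dim, dim)  # message from [relation; neighbor]
        self.att = nn.Linear(3 * dim, 1)    # score over [entity; relation; neighbor]

    def forward(self, ent, rel_ctx, nbr_ctx):
        # ent: (d,); rel_ctx, nbr_ctx: (k, d) context relations/neighbors of ent
        messages = self.msg(torch.cat([rel_ctx, nbr_ctx], dim=-1))  # (k, d)
        scores = self.att(
            torch.cat([ent.expand_as(rel_ctx), rel_ctx, nbr_ctx], dim=-1)
        )                                                           # (k, 1)
        weights = F.softmax(scores, dim=0)   # attention over the k context triples
        return (weights * messages).sum(dim=0)                      # (d,)
```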
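Similarly, the knowledge contrastive loss is described only as a combination of a contrastive loss with cross-entropy that supplies many negative samples. A minimal sketch under that reading follows; the InfoNCE-style term, the temperature, and the mixing weight `alpha` are assumptions, and the paper's exact combination may differ.

```python
import torch
import torch.nn.functional as F

def knowledge_contrastive_loss(query, entity_emb, target,
                               temperature=0.1, alpha=0.5):
    # query:      (B, d) encoder output for each (head, relation) query
    # entity_emb: (N, d) embeddings of all candidate tail entities
    # target:     (B,)   index of the true tail entity for each query

    # InfoNCE-style term: every other entity acts as a negative sample,
    # one way to realize the "more negative samples" idea from the abstract.
    sim = F.normalize(query, dim=-1) @ F.normalize(entity_emb, dim=-1).T  # (B, N)
    contrastive = F.cross_entropy(sim / temperature, target)

    # Plain cross-entropy term over unnormalized scores.
    scores = query @ entity_emb.T                                          # (B, N)
    ce = F.cross_entropy(scores, target)

    # alpha is a hypothetical mixing weight between the two terms.
    return alpha * contrastive + (1 - alpha) * ce
```

For example, `knowledge_contrastive_loss(torch.randn(2, 8), torch.randn(5, 8), torch.tensor([0, 3]))` scores two queries against five candidate entities and returns a scalar loss; every non-target entity contributes as a negative, which is what gives sparse entities a richer gradient signal than cross-entropy alone.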