From Discrimination to Generation: Knowledge Graph Completion with Generative Transformer

Published: 28 Apr 2022, Last Modified: 22 Oct 2023, DLG4NLP 2022 Poster
Keywords: Knowledge Graph Completion, Generation, Transformer
TL;DR: A seq2seq-based knowledge graph completion approach with a pre-trained Transformer.
Abstract: Knowledge graph completion aims to extend a knowledge graph (KG) with missing triples. In this paper, we propose GenKGC, an approach that converts knowledge graph completion into a sequence-to-sequence generation task with a pre-trained language model. We further introduce relation-guided demonstration and entity-aware hierarchical decoding for better representation learning and faster inference. Experimental results on three datasets show that our approach obtains better or comparable performance to baselines and achieves faster inference than previous methods based on pre-trained language models. We also release OpenBG500, a new large-scale Chinese knowledge graph dataset, for research purposes. Code and datasets are available at https://github.com/zjunlp/PromptKG/tree/main/research/GenKGC.
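
To make the seq2seq formulation concrete, below is a minimal sketch of generative knowledge graph completion using HuggingFace Transformers with a generic BART model. The model name, input template, and [SEP] delimiter are illustrative assumptions, not GenKGC's actual interface; see the repository above for the real implementation.

```python
# Minimal sketch: knowledge graph completion as sequence-to-sequence
# generation with a pre-trained language model. Illustrative only; this
# is not GenKGC's code (see the linked PromptKG repository).
from transformers import BartForConditionalGeneration, BartTokenizer

tokenizer = BartTokenizer.from_pretrained("facebook/bart-base")
model = BartForConditionalGeneration.from_pretrained("facebook/bart-base")

# Linearize an incomplete triple (head, relation, ?) into a text query.
# The template and delimiter here are hypothetical choices.
head, relation = "Steve Jobs", "founder of"
query = f"{head} [SEP] {relation} [SEP]"

inputs = tokenizer(query, return_tensors="pt")

# Generate the missing tail entity with ordinary beam search; GenKGC
# instead uses entity-aware hierarchical decoding to constrain the
# output space and speed up inference.
outputs = model.generate(**inputs, num_beams=5, max_length=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that an off-the-shelf model would first need fine-tuning on (linearized query, tail entity) pairs to produce meaningful completions; GenKGC's relation-guided demonstrations and entity-aware hierarchical decoding refine this basic recipe.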
Community Implementations: 1 code implementation (CatalyzeX: https://www.catalyzex.com/paper/arxiv:2202.02113/code)
