On the Use of Entity Embeddings from Pre-Trained Language Models for Knowledge Graph Completion

Anonymous

16 Nov 2021 (modified: 05 May 2023) · ACL ARR 2021 November Blind Submission
Abstract: Recent work has shown that entity representations extracted from pre-trained language models can make knowledge graph completion models more robust to the sparsity that naturally occurs in knowledge graphs. In this work, we study how best to extract and incorporate those embeddings. We first assess the suitability of the extracted embeddings for direct use in entity ranking, and introduce both unsupervised and supervised post-processing methods that improve downstream performance. We then introduce supervised embedding extraction methods and demonstrate that they yield more informative representations. We also examine the effect of language model selection and find that the choice of model has a significant impact. Finally, we synthesize these findings into a knowledge graph completion model that significantly outperforms recent neural models.
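The abstract does not spell out the extraction procedure, but a minimal sketch of the kind of unsupervised pipeline it alludes to might look like the following: mean-pool a pre-trained language model's final-layer token representations over an entity's surface name, then rank candidate entities by embedding similarity. The model choice, pooling strategy, and all function names below are illustrative assumptions, not the authors' method.

```python
# Hypothetical sketch: extracting entity embeddings from a pre-trained LM
# and using them for a simple similarity-based entity ranking.
# Pooling strategy and model choice are assumptions, not the paper's method.
import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

MODEL_NAME = "bert-base-uncased"  # the paper studies model choice; this is one option
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModel.from_pretrained(MODEL_NAME)
model.eval()

@torch.no_grad()
def entity_embedding(name: str) -> torch.Tensor:
    """Mean-pool the final-layer token representations of an entity's name."""
    inputs = tokenizer(name, return_tensors="pt")
    hidden = model(**inputs).last_hidden_state  # (1, seq_len, hidden_dim)
    return hidden.mean(dim=1).squeeze(0)        # (hidden_dim,)

def rank_candidates(query: str, candidates: list[str]) -> list[tuple[str, float]]:
    """Rank candidate entities by cosine similarity to a query entity."""
    q = entity_embedding(query)
    scored = [
        (c, F.cosine_similarity(q, entity_embedding(c), dim=0).item())
        for c in candidates
    ]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

# Example: rank possible related entities for a sparse query entity.
print(rank_candidates("Marie Curie", ["physicist", "painter", "Warsaw"]))
```

Mean pooling over the name tokens is only one plausible choice; the supervised extraction methods the abstract describes would presumably replace this step with representations learned for the completion task.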