Incorporating structural knowledge into language models for open knowledge graph completion

Published: 01 Jan 2025, Last Modified: 20 May 2025 · World Wide Web (WWW) 2025 · CC BY-SA 4.0
Abstract: Knowledge graphs (KGs) have proven valuable and practical in many AI-supported applications, such as e-commerce recommendation and legal consultation. Because our collective knowledge keeps growing, KGs are generally incomplete, and under the open-world assumption (OWA) knowledge absent from a KG is not necessarily false. Hence, there has been a surge of research interest in knowledge reasoning through KG Completion (KGC) under the OWA. Language models (LMs) have emerged as a promising alternative because they learn directly from open corpora, which has encouraged their use to broaden the scope of knowledge under the OWA. However, retrieving important and relevant open knowledge from LMs to maintain an effective candidate space for KGC remains a major challenge for existing models. In view of this, we propose struOKGC, which incorporates the structural knowledge of a KG into language models to improve KGC under the OWA. Specifically, we first apply a relation-specific template to construct a triplet prompt. Second, we capture local structure in the KG as knowledge constraints to avoid knowledge confusion. Next, we design a unified prompt for KGC tasks that smoothly concatenates semantic and structural knowledge; in particular, for PLMs we apply a learnable vector to form a soft prompt with more coherent semantics. Finally, we feed the unified prompt into LMs for the open KGC task. Experimental results on benchmark datasets show that our model outperforms state-of-the-art methods, and further studies verify its effectiveness.
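The abstract's pipeline (relation-specific triplet template, then a unified prompt that prepends learnable soft-prompt vectors to the token embeddings) can be sketched roughly as follows. This is a minimal illustration under stated assumptions, not the paper's implementation: the template, vocabulary, and names such as `soft_prompt` and `unified_prompt` are hypothetical, and the soft-prompt vectors are shown randomly initialized where the actual model would learn them during training.

```python
# Hypothetical sketch of a unified prompt: learnable soft-prompt vectors
# concatenated with embeddings of a relation-specific triplet prompt.
import numpy as np

rng = np.random.default_rng(0)

embed_dim = 8
# Toy vocabulary and embedding table (an LM would supply these).
vocab = {"[H]": 0, "born_in": 1, "[T]": 2, "is": 3, "the": 4, "birthplace": 5, "of": 6}
embedding_table = rng.normal(size=(len(vocab), embed_dim))

def triplet_prompt(head, relation, tail):
    # Relation-specific template: verbalize (head, relation, tail) as tokens.
    templates = {"born_in": ["[T]", "is", "the", "birthplace", "of", "[H]"]}
    return [head if t == "[H]" else tail if t == "[T]" else t
            for t in templates[relation]]

def unified_prompt(tokens, n_soft=3):
    # Soft-prompt vectors (learnable in the real model, random here) are
    # prepended to the token embeddings, forming one input sequence for the LM.
    soft_prompt = rng.normal(size=(n_soft, embed_dim))
    token_embs = embedding_table[[vocab[t] for t in tokens]]
    return np.concatenate([soft_prompt, token_embs], axis=0)

tokens = triplet_prompt("[H]", "born_in", "[T]")
prompt = unified_prompt(tokens)
print(prompt.shape)  # 3 soft vectors + 6 template tokens, each of embed_dim
```

In the actual method the soft-prompt vectors would be trained jointly with the KGC objective so that the structural constraints and the verbalized triplet read as one coherent input to the PLM.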