Raker: A Relation-aware Knowledge Reasoning Model for Inductive Relation Prediction

ACL ARR 2024 June Submission1334 Authors

14 Jun 2024 (modified: 02 Aug 2024) · ACL ARR 2024 June Submission · CC BY 4.0
Abstract: Inductive relation prediction, an important task in knowledge graph completion, aims to predict the relations between entities that are unseen at the training stage. The latest methods use pre-trained language models (PLMs) to encode the paths between the head and tail entities and achieve state-of-the-art prediction performance. However, these methods cannot handle no-path situations well, nor can they learn comprehensive relation representations that distinguish different relations, which is the core difficulty of inductive relation prediction. To tackle these issues, we propose a novel \textbf{R}elation-\textbf{a}ware \textbf{k}nowledg\textbf{e} \textbf{r}easoning model, entitled Raker, which introduces an adaptive reasoning information extraction method to identify relation-aware reasoning neighbors of the entities in the target triple, thereby handling no-path situations, and makes the PLM more aware of the possible relations through relation-specific soft prompting. Raker is evaluated on three public datasets and achieves state-of-the-art performance in inductive relation prediction compared with the baseline methods. Notably, the absolute improvement of Raker exceeds 10\% on the FB15k-237 dataset in the inductive setting. Moreover, Raker also demonstrates its superiority in transductive and few-shot settings. The code of Raker is available at \href{https://anonymous.4open.science/r/Raker-9234}{https://anonymous.4open.science/r/Raker-9234}.
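The relation-specific soft prompting mentioned in the abstract can be sketched roughly as follows. This is a minimal illustration of the general soft-prompting idea (prepending learnable per-relation vectors to the PLM's input embeddings); all names, dimensions, and the exact prepending scheme are assumptions for illustration, not details taken from the paper.

```python
import numpy as np

# Illustrative dimensions (assumptions, not from the paper).
rng = np.random.default_rng(0)
embed_dim = 8       # PLM embedding size
num_relations = 3   # number of candidate relations
prompt_len = 2      # soft-prompt vectors per relation

# One learnable soft-prompt matrix per candidate relation; in practice these
# would be trainable parameters updated by backpropagation.
soft_prompts = rng.normal(size=(num_relations, prompt_len, embed_dim))

def build_plm_input(token_embeddings, relation_id):
    """Prepend the relation's soft-prompt vectors to the token embeddings,
    so the PLM's input is conditioned on the relation being scored."""
    return np.concatenate([soft_prompts[relation_id], token_embeddings], axis=0)

# Embeddings of a verbalized path/triple (placeholder values).
tokens = rng.normal(size=(5, embed_dim))
inp = build_plm_input(tokens, relation_id=1)
print(inp.shape)  # 2 prompt vectors + 5 token embeddings -> (7, 8)
```

In this sketch, scoring a candidate relation amounts to encoding the same textual input under that relation's prompt, so different relations yield different PLM inputs from identical text.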
Paper Type: Long
Research Area: NLP Applications
Research Area Keywords: knowledge graphs
Contribution Types: Model analysis & interpretability, NLP engineering experiment
Languages Studied: English
Submission Number: 1334