Neighbors Always Help: A Relation-aware Knowledge Reasoning Model for Inductive Relation Prediction

Anonymous

16 Feb 2024 · ACL ARR 2024 February Blind Submission · Readers: Everyone
Abstract: Inductive relation prediction, an important task in knowledge graph completion, aims to predict the relations between entities that are unseen at training time. The latest methods use pre-trained language models (PLMs) to encode the paths between the head entity and the tail entity and achieve state-of-the-art prediction performance. However, these methods cannot handle no-path situations well and fail to learn comprehensive representations for different relations, which makes inductive relation prediction difficult. To tackle this issue, we propose a novel \textbf{R}elation-\textbf{a}ware \textbf{k}nowledg\textbf{e} \textbf{r}easoning model, entitled Raker. Raker develops an adaptive reasoning-information extraction method that identifies relation-aware reasoning neighbors of the entities in the target triple to handle no-path situations, and makes PLMs aware of the relation to be predicted through relation-specific soft prompting. Raker is evaluated on three public datasets and achieves state-of-the-art performance in inductive relation prediction compared with the baseline methods. Notably, Raker achieves an absolute improvement of more than 10\% in the FB15k-237 inductive setting. Moreover, Raker also demonstrates its superiority in both transductive and few-shot settings. The code of Raker will be made publicly available after the double-blind review process.
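To make the relation-specific soft prompting idea concrete, here is a minimal sketch (not the authors' released code) of how a per-relation learnable prompt could be prepended to the PLM input embeddings before encoding the target triple and its reasoning neighbors. The class name RelationSoftPrompt, the bert-base-uncased backbone, and parameters such as num_prompt_tokens are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn as nn
from transformers import BertModel

class RelationSoftPrompt(nn.Module):
    """Illustrative sketch: prepend a learnable, relation-specific prompt to PLM inputs."""

    def __init__(self, plm_name="bert-base-uncased", num_relations=237, num_prompt_tokens=5):
        super().__init__()
        self.plm = BertModel.from_pretrained(plm_name)
        hidden = self.plm.config.hidden_size
        # One learnable prompt of shape (num_prompt_tokens, hidden) per relation.
        self.prompts = nn.Embedding(num_relations, num_prompt_tokens * hidden)
        self.num_prompt_tokens = num_prompt_tokens
        self.hidden = hidden

    def forward(self, input_ids, attention_mask, relation_ids):
        # Embed the tokenized context (target triple plus reasoning-neighbor text).
        tok_embeds = self.plm.get_input_embeddings()(input_ids)              # (B, L, H)
        batch = input_ids.size(0)
        prompt = self.prompts(relation_ids).view(batch, self.num_prompt_tokens, self.hidden)
        # Prepend the relation-specific prompt vectors to the token embeddings.
        inputs_embeds = torch.cat([prompt, tok_embeds], dim=1)               # (B, P+L, H)
        prompt_mask = torch.ones(batch, self.num_prompt_tokens, device=attention_mask.device)
        attn = torch.cat([prompt_mask, attention_mask], dim=1)
        out = self.plm(inputs_embeds=inputs_embeds, attention_mask=attn)
        # Use the first position of the encoded sequence as the triple representation.
        return out.last_hidden_state[:, 0]
```

Under these assumptions, the prompt embeddings are the only relation-specific parameters added on top of the PLM, so unseen entities at test time are handled purely through the encoded textual context while the prompt conditions the encoder on the relation being predicted.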
Paper Type: long
Research Area: NLP Applications
Contribution Types: Model analysis & interpretability, NLP engineering experiment
Languages Studied: English