Question-Directed Reasoning With Relation-Aware Graph Attention Network for Complex Question Answering Over Knowledge Graph

Published: 01 Jan 2024, Last Modified: 19 Feb 2025. IEEE/ACM Trans. Audio Speech Lang. Process. 2024. License: CC BY-SA 4.0
Abstract: Complex knowledge graph question answering (KGQA) aims to answer natural language questions by retrieving entities from a knowledge graph (KG). Recently, relation path-based models have shown a unique advantage for complex KGQA. However, these existing models ignore the dependencies between different relation paths, which leads to aimless reasoning over the KG. To resolve this issue, we propose question-directed reasoning with a relation-aware graph attention network (QRGAT), which encodes the reasoning process as a reasoning graph. The relation-aware GAT recognizes each entity's neighbor entities along with the corresponding relations. By stacking multiple relation-aware GAT layers, it can collaboratively capture the dependencies among different relation paths for each entity. Question-directed reasoning then uses the information learned by the relation-aware GAT to avoid aimless reasoning over the KG by constructing a reasoning graph. Extensive experiments demonstrate that QRGAT outperforms baseline models on both popular datasets, WebQuestionsSP and ComplexWebQuestions. Compared with the strong GNN-based baseline NSM $_{+h}$ , QRGAT improves Hits@1 by 2.3% on WebQuestionsSP and by 3.6% on ComplexWebQuestions.
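To make the core idea concrete, below is a minimal NumPy sketch of one relation-aware attention layer in the spirit the abstract describes: each entity attends over its neighbors using a message built from the neighbor entity embedding together with the relation embedding on the connecting edge. The function name, the concatenation-based message, and the tanh attention scoring are illustrative assumptions, not the paper's exact QRGAT formulation.

```python
import numpy as np

def relation_aware_gat_layer(h, rel_emb, edges, W, w_att):
    """One illustrative relation-aware GAT layer (a sketch, not the paper's exact model).

    h:       (n, d) entity embeddings
    rel_emb: (m, d) relation embeddings
    edges:   list of (src, rel, dst) triples
    W:       (2d, d) message projection
    w_att:   (d,) attention scoring vector
    """
    n, d = h.shape
    out = np.zeros_like(h)
    # Group incoming edges by target entity.
    incoming = {}
    for s, r, t in edges:
        incoming.setdefault(t, []).append((s, r))
    for t, nbrs in incoming.items():
        # Message per neighbor: project [neighbor entity ; relation] jointly,
        # so the relation on the edge shapes the message (relation-awareness).
        msgs = np.array([np.concatenate([h[s], rel_emb[r]]) @ W for s, r in nbrs])
        scores = np.tanh(msgs) @ w_att            # attention logits
        alpha = np.exp(scores - scores.max())
        alpha /= alpha.sum()                      # softmax over neighbors
        out[t] = alpha @ msgs                     # attention-weighted aggregation
    # Entities with no incoming edge keep their embedding unchanged.
    for i in range(n):
        if i not in incoming:
            out[i] = h[i]
    return out
```

Stacking several such layers lets information propagate along multi-hop relation paths, which is how a multi-layer relation-aware GAT can relate different paths arriving at the same entity.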
