Topic-Driven Hyper-relational Knowledge Graphs with Adaptive Reconstruction for Multi-hop Question Answering Using LLMs
Abstract: Large language models (LLMs) perform well on single-hop question answering (QA) tasks but face significant challenges in multi-hop QA tasks that require multi-step reasoning across multiple paragraphs. Although prompt-based LLMs leveraging the chain-of-thought (CoT) mechanism have improved multi-hop reasoning, they still suffer from performance degradation and hallucinations when critical knowledge is missing or outdated. To address these challenges, we propose a novel Topic-driven Hyper-relational Knowledge Graph with an Adaptive Reconstruction method that dynamically retrieves and prunes contextually relevant facts. Our approach employs a topic-driven pruning strategy to refine relevant facts and reduce noise, thus improving the reasoning ability of LLMs. Additionally, we introduce an adaptive reconstruction mechanism that dynamically supplements missing knowledge and optimizes the input during reasoning, preventing interruptions in reasoning chains. Experiments on the HotpotQA and MuSiQue datasets show that our approach significantly outperforms state-of-the-art baselines across multiple metrics, validating its effectiveness on complex multi-hop reasoning tasks.
External IDs: dblp:conf/icann/ZhangCC25