Beyond Chains: Bridging Large Language Models and Knowledge Bases in Complex Question Answering

ACL ARR 2025 May Submission3073 Authors

19 May 2025 (modified: 03 Jul 2025) · ACL ARR 2025 May Submission · CC BY 4.0
Abstract: Knowledge Base Question Answering (KBQA) aims to answer natural language questions using structured knowledge from knowledge bases (KBs). While LLM-only approaches offer generalization, they suffer from outdated knowledge, hallucinations, and a lack of transparency. Chain-based KG-RAG methods address these issues by incorporating external KBs, but they are limited to simple chain-structured questions because they lack planning and logical structuring. Inspired by semantic parsing methods, we propose PDRR, a four-stage framework consisting of Predict, Decompose, Retrieve, and Reason. Our method first predicts the question type and decomposes the question into structured triples. It then retrieves relevant information from KBs and guides the LLM, acting as an agent, to reason over and complete the decomposed triples. Experimental results show that our proposed KBQA model, PDRR, consistently outperforms existing methods across different LLM backbones and achieves superior performance on various question types.
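To make the four stages concrete, below is a minimal sketch of the PDRR control flow as described in the abstract. The function names (predict_question_type, decompose_to_triples, retrieve_from_kb, reason_and_complete), the Triple structure, and the stub bodies standing in for LLM and KB calls are all illustrative assumptions, not the authors' actual implementation.

```python
from dataclasses import dataclass

@dataclass
class Triple:
    """A structured triple; tail=None marks an unknown slot to be filled."""
    head: str
    relation: str
    tail: str | None = None

def predict_question_type(question: str) -> str:
    # Stage 1 (Predict): classify the question structure.
    # Stub standing in for an LLM call.
    return "chain"

def decompose_to_triples(question: str, qtype: str) -> list[Triple]:
    # Stage 2 (Decompose): break the question into triples with unknown slots.
    # Stub standing in for an LLM call.
    return [Triple("Barack Obama", "spouse", None)]

def retrieve_from_kb(triple: Triple) -> list[str]:
    # Stage 3 (Retrieve): query the KB for candidate fillers of the
    # triple's unknown slot. Stub standing in for a SPARQL/KB lookup.
    return ["Michelle Obama"]

def reason_and_complete(question: str, triples: list[Triple]) -> str | None:
    # Stage 4 (Reason): the LLM acts as an agent, selecting among retrieved
    # candidates to complete each triple; here the choice is a stub.
    for t in triples:
        candidates = retrieve_from_kb(t)
        t.tail = candidates[0] if candidates else None
    return triples[-1].tail

question = "Who is Barack Obama married to?"
qtype = predict_question_type(question)
triples = decompose_to_triples(question, qtype)
print(reason_and_complete(question, triples))  # -> Michelle Obama
```

In this sketch the planning happens up front (Predict and Decompose), so non-chain questions could be handled by emitting several parallel triples rather than a single linear chain, which is the limitation of chain-based methods that the abstract highlights.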
Paper Type: Long
Research Area: Question Answering
Research Area Keywords: retrieval-augmented generation, knowledge base QA, reasoning
Contribution Types: Theory
Languages Studied: English
Keywords: KB question answering, knowledge graphs, large language models
Submission Number: 3073