SymQA: Enhanced Knowledge Graph Question Answering with Symbolic Program Generation and Execution

Viet-Nhat Thai, Long Nguyen

Published: 01 Jan 2026 · Last Modified: 14 Dec 2025 · License: CC BY-SA 4.0
Abstract: Knowledge Graph Question Answering (KGQA) has made great strides through semantic parsing into executable logical forms, but practical deployment remains difficult. While Large Language Models (LLMs) have shown potential for generating logical forms, they have inherent limitations in complex reasoning and knowledge graph grounding. We introduce SymQA, a framework that addresses these difficulties by combining symbolic program generation with neural models. Our multi-stage workflow leverages specialized neural components: BERT for entity and function extraction, BART for KoPL program generation, and DeepSeek-R1 for dynamic question reformulation. For cases where direct extraction fails, we introduce the Function-Aware Structure-Preserving (FASP) resolver, which preserves logical coherence by retrieving structurally similar questions from the training data while preserving their function sequences, keywords, and logical relationships. Extensive evaluation on the KQA Pro benchmark establishes our method as state of the art with 94.23% accuracy, including strong performance on complex reasoning: 93.53% on multi-hop reasoning, 96.47% on comparison questions, and 95.63% on zero-shot cases. SymQA's transparent reasoning offers both performance and explainability, providing a foundation for robust, practical KGQA systems.
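The FASP retrieval idea described above can be illustrated with a minimal sketch: among training questions whose KoPL function sequence matches the query's, pick the one with the highest keyword overlap. The data, scoring function, and field names below are illustrative assumptions, not the paper's implementation.

```python
# Hypothetical FASP-style resolver sketch. The training items, the exact
# matching criterion, and the Jaccard keyword score are all assumptions
# made for illustration; the paper's actual resolver may differ.

def keyword_overlap(a: str, b: str) -> float:
    """Jaccard overlap between the word sets of two questions."""
    sa, sb = set(a.lower().split()), set(b.lower().split())
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

def fasp_retrieve(question: str, func_seq: list, training_data: list):
    """Among training items whose function sequence matches exactly,
    return the one with the highest keyword overlap (None if no match)."""
    candidates = [item for item in training_data
                  if item["functions"] == func_seq]
    if not candidates:
        return None
    return max(candidates,
               key=lambda item: keyword_overlap(question, item["question"]))

# Toy training set: questions paired with their KoPL function sequences.
training_data = [
    {"question": "Which city has a larger population, Paris or Rome?",
     "functions": ["Find", "Find", "SelectBetween"]},
    {"question": "What is the capital of France?",
     "functions": ["Find", "Relate", "QueryName"]},
]

best = fasp_retrieve(
    "Which river is longer, the Nile or the Amazon?",
    ["Find", "Find", "SelectBetween"],
    training_data,
)
```

Because the retrieved item shares the query's function sequence, its program skeleton can be reused with new entity arguments, which is what preserves logical coherence when direct extraction fails.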