KEMRP: Can a Knowledge Graph Enhance the Ability of LLMs to Solve Math Word Problems?

ACL ARR 2025 February Submission 2978 Authors

15 Feb 2025 (modified: 09 May 2025) · License: CC BY 4.0
Abstract: This paper introduces a novel framework that combines Large Language Models (LLMs) with mathematical knowledge graphs (KGs) to solve Math Word Problems (MWPs). Current methods leveraging LLMs for MWPs rely primarily on fine-tuning or prompt engineering; the former operates as a black box with limited interpretability, while the latter depends entirely on the inherent abilities of LLMs. In contrast, our approach enables explicit and interpretable mathematical reasoning by dynamically linking linguistic patterns to structured mathematical knowledge. We present two comprehensive knowledge graphs—MWPEN-KG (English) and MT700-KG (Chinese)—that capture essential mathematical concepts and relationships for problem-solving. The framework leverages LLMs to decompose problems into mathematical concepts while simultaneously retrieving relevant entities and paths in the KG to guide step-by-step solutions. Extensive experiments across five MWP benchmarks (MAWPS, MathQA, Math23K, Ape210k, CM17K) using four different LLMs (DeepSeek-Chat, GPT-4o, GPT-3.5-Turbo, Qwen-Turbo) show that the framework outperforms conventional methods, achieving state-of-the-art results on all five datasets. Our work demonstrates that combining LLMs with KGs has significant potential for solving MWPs.
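The abstract gives no implementation details, so the following is only a minimal Python sketch of the pipeline it describes: an LLM decomposes the problem into concepts, matching paths are retrieved from a KG, and the verbalized paths are injected into the solving prompt. The toy graph (a stand-in for MWPEN-KG), the `call_llm` stub (standing in for any of the four LLM back-ends), and all function names are illustrative assumptions, not the authors' code.

```python
# Sketch of a KEMRP-style pipeline (illustrative, not the paper's code):
# (1) LLM extracts mathematical concepts from the word problem,
# (2) entities/paths linking those concepts are retrieved from a KG,
# (3) the retrieved paths guide a step-by-step solving prompt.
import networkx as nx


def build_toy_kg() -> nx.DiGraph:
    """Tiny stand-in for MWPEN-KG: concept nodes joined by typed relations."""
    kg = nx.DiGraph()
    kg.add_edge("unit price", "total cost", relation="multiplied_by_quantity_gives")
    kg.add_edge("total cost", "change", relation="subtracted_from_payment_gives")
    return kg


def call_llm(prompt: str) -> str:
    """Placeholder for GPT-4o / DeepSeek-Chat / Qwen-Turbo / GPT-3.5-Turbo."""
    if "List the mathematical concepts" in prompt:
        return "unit price, change"
    return "Step 1: total cost = 3 * 2 = 6. Step 2: change = 10 - 6 = 4."


def retrieve_paths(kg: nx.DiGraph, concepts: list[str]) -> list[str]:
    """Connect each ordered concept pair through the KG and verbalize the path."""
    paths = []
    for src in concepts:
        for dst in concepts:
            if src != dst and nx.has_path(kg, src, dst):
                hops = nx.shortest_path(kg, src, dst)
                rels = [kg[u][v]["relation"] for u, v in zip(hops, hops[1:])]
                verbal = " -> ".join(f"{n} [{r}]" for n, r in zip(hops, rels))
                paths.append(f"{verbal} -> {hops[-1]}")
    return paths


def solve(problem: str) -> str:
    kg = build_toy_kg()
    reply = call_llm(f"List the mathematical concepts in: {problem}")
    concepts = [c.strip() for c in reply.split(",")]
    kg_context = "\n".join(retrieve_paths(kg, concepts))
    return call_llm(f"Knowledge:\n{kg_context}\nSolve step by step: {problem}")


if __name__ == "__main__":
    print(solve("Pens cost $2 each. Ann buys 3 and pays with $10. What is her change?"))
```

Using a stubbed `call_llm` keeps the sketch runnable offline; swapping it for a real chat-completion client would reproduce the retrieve-then-prompt flow the abstract outlines.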
Paper Type: Long
Research Area: NLP Applications
Research Area Keywords: educational applications, mathematical NLP, knowledge graphs
Contribution Types: NLP engineering experiment, Data resources
Languages Studied: English, Chinese
Submission Number: 2978