GraphThought: Graph Combinatorial Optimization with Thought Generation

ACL ARR 2026 January Submission4422 Authors

05 Jan 2026 (modified: 20 Mar 2026) · ACL ARR 2026 January Submission · CC BY 4.0
Keywords: language model, reasoning language model, graph theory, graph combinatorial optimization
Abstract: Graph combinatorial optimization (GCO) is critical across scientific and industrial domains. While Large Language Models (LLMs) show potential for structured reasoning in GCO, they often struggle with rigorous multi-step deduction and produce hallucinations. To address this, we first formalize the Optimal Thoughts Design (OTD) problem to provide structured guidance for intermediate reasoning. Building upon this formulation, we introduce GraphThought, a framework that generates high-quality thought sequences via either heuristic-guided forward search or solver-aligned backward reasoning. Fine-tuned on these sequences yields Llama-GT (8B), which achieves state-of-the-art performance on the GraphArena benchmark, outperforming significantly larger models like DeepSeek-V3. Our results demonstrate that structured reasoning priors can significantly enhance LLM performance on GCO tasks without increasing model scale.
Paper Type: Long
Research Area: Mathematical, Symbolic, Neurosymbolic, and Logical Reasoning
Research Area Keywords: mathematical NLP, reasoning
Contribution Types: Model analysis & interpretability, NLP engineering experiment, Data resources
Languages Studied: English
Submission Number: 4422