LogicFlow: Integrating Symbolic Deduction and Gradient-Based Reasoning in Large Language Models

Published: 15 Nov 2025, Last Modified: 08 Mar 2026, AAAI 2026 Bridge: LM Reasoning, CC BY 4.0
Keywords: LLM, Reasoning, Logic, Inference
TL;DR: LogicFlow integrates symbolic deduction and differentiable optimization to improve logical consistency and interpretability in large language model reasoning.
Abstract: Recent advances in large language models (LLMs) have demonstrated remarkable reasoning capabilities, yet their internal reasoning processes remain opaque and prone to inconsistency. To address this limitation, we propose LogicFlow, a hybrid framework that unifies symbolic deduction and neural reasoning in LLMs. LogicFlow decomposes each reasoning trace into an explicit logic flow graph, performs differentiable consistency optimization that aligns symbolic logic outcomes with neural predictions, and enables gradient-based refinement of intermediate reasoning steps.
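The consistency optimization described above can be illustrated with a minimal sketch. All names, the binary cross-entropy loss, and the update rule here are illustrative assumptions, not details from the paper: neural confidences over intermediate steps are pulled toward the 0/1 verdicts of a symbolic deduction engine by a gradient step on the loss.

```python
import math

# Hypothetical sketch of a differentiable consistency objective (assumed,
# not the paper's actual formulation).
# neural_probs: the LLM's confidence that each intermediate step holds.
# symbolic_truth: 0/1 verdicts returned by a symbolic deduction engine.

def consistency_loss(neural_probs, symbolic_truth):
    # Binary cross-entropy between neural confidences and symbolic verdicts.
    return -sum(t * math.log(p) + (1 - t) * math.log(1 - p)
                for p, t in zip(neural_probs, symbolic_truth)) / len(neural_probs)

def gradient_step(neural_probs, symbolic_truth, lr=0.1):
    # d(BCE)/dp = (p - t) / (p * (1 - p)); one step nudges each confidence
    # toward the symbolic verdict, i.e. refines the intermediate steps.
    eps = 1e-6
    return [min(max(p - lr * (p - t) / (p * (1 - p)), eps), 1 - eps)
            for p, t in zip(neural_probs, symbolic_truth)]

probs = [0.9, 0.4, 0.7]          # neural confidences for three steps
truth = [1.0, 1.0, 0.0]          # engine: steps 1 and 2 hold, step 3 fails
before = consistency_loss(probs, truth)
after = consistency_loss(gradient_step(probs, truth), truth)
```

After one step the loss decreases, since each confidence moves toward the corresponding symbolic verdict.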
Submission Number: 43