Executable Networks of Thought: Scaling Reasoning with LLM Workflow Template

ICLR 2026 Conference Submission 14610 Authors

18 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: Executable Workflow, Prompt Engineering, Large Language Model
TL;DR: We introduce XNoT, a prompt-native framework that compiles LLM plans into executable networks of thought, scaling reasoning by decomposing tasks into elementary steps with explicit dependencies for higher accuracy and lower cost.
Abstract: Prior prompting schemes, such as Chain-of-Thought (CoT) and Tree of Thoughts (ToT), either lack modularity or rely on manually engineered, task-specific prompts and fixed solution structures, which limits their scalability. To overcome these limitations, we propose Executable Network of Thoughts (XNoT), a prompting scheme that leverages LLMs’ intrinsic capabilities to autonomously plan and execute reasoning steps from minimal user input. Central to XNoT is the LLM Workflow Template (LWT), a format that supports a network of thought dependencies among sequential elementary steps, enabling XNoT to flexibly adapt to different task complexities and input lengths. XNoT demonstrates superior scalability compared to prior methods. For example, while all methods achieve near 100% accuracy on sorting 16 numbers, XNoT attains 92% on sorting 32 numbers, substantially outperforming CoT (0%) and ToT (12%).
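The core mechanism described in the abstract, decomposing a task into elementary steps with explicit dependencies and executing them as a network, can be illustrated with a minimal sketch. The snippet below is an illustrative assumption, not the paper's actual LWT format or code: the `run_workflow` function, the step names, and the use of plain Python callables in place of per-step LLM prompts are all hypothetical stand-ins. It executes a small dependency network in topological order on a toy version of the sorting task mentioned in the abstract.

```python
# Illustrative sketch only: NOT the authors' LWT format, just one plausible way to
# execute "elementary steps with explicit dependencies" as a network of thought.
from graphlib import TopologicalSorter


def run_workflow(steps, deps):
    """Run steps in dependency order, feeding each step its predecessors' outputs.

    steps: dict mapping step name -> callable(*predecessor_outputs)
           (each callable stands in for an LLM call on that step's prompt)
    deps:  dict mapping step name -> list of predecessor step names
    """
    results = {}
    for name in TopologicalSorter(deps).static_order():
        inputs = [results[d] for d in deps.get(name, [])]
        results[name] = steps[name](*inputs)
    return results


# Toy instance: sort 8 numbers by splitting, sorting each half, then merging.
numbers = [42, 7, 19, 3, 88, 1, 56, 23]
steps = {
    "split": lambda: (numbers[:4], numbers[4:]),
    "sort_left": lambda halves: sorted(halves[0]),   # would be an LLM step in XNoT
    "sort_right": lambda halves: sorted(halves[1]),  # would be an LLM step in XNoT
    "merge": lambda left, right: sorted(left + right),
}
deps = {
    "split": [],
    "sort_left": ["split"],
    "sort_right": ["split"],
    "merge": ["sort_left", "sort_right"],
}

print(run_workflow(steps, deps)["merge"])  # [1, 3, 7, 19, 23, 42, 56, 88]
```

In the actual framework, each node would presumably carry a prompt produced by the LLM during its planning phase rather than a Python callable, and nodes with no dependency between them (here the two sort steps) need not wait on one another.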
Supplementary Material: zip
Primary Area: foundation or frontier models, including LLMs
Submission Number: 14610