Keywords: Large Language Model, LLM reasoning, Monte Carlo Tree Search, NLP, Cost-sensitive planning, Tree Search
TL;DR: A novel search algorithm combines LLM reasoning with cost-aware tree search, boosting success and efficiency on budget-constrained planning tasks.
Abstract: While large language models excel at open-ended reasoning, they often struggle with cost-sensitive planning, either treating all actions as having equal cost or failing to stay within strict budgets. In this paper, we introduce Cost-Augmented Tree Search (CATS), a novel search approach that brings explicit cost-awareness into LLM-guided planning. Tight cost constraints push the planner to quickly rule out infeasible candidate plans, while looser constraints encourage optimization for minimal cost. We benchmark top LLMs such as GPT-4.1 and Claude-Opus-4.1 against our CATS planner to evaluate their performance on a cost-augmented variant of BlocksWorld, where each action is assigned a specific cost and tasks must be completed under an overall budget constraint. Our experiments show that raw LLMs, such as Claude-Opus-4.1, often falter under tight budgets, whereas CATS consistently delivers strong performance with higher task success rates and better budget utilization. CATS provides an effective solution for budget-aware planning by combining the reasoning power of LLMs with structured search.
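The budget-aware pruning described in the abstract can be illustrated with a minimal sketch: a best-first search over an action graph that discards any branch whose accumulated cost exceeds the total budget. This is a hypothetical illustration of the general idea, not the paper's actual CATS algorithm; the function and action names (`budget_aware_search`, `successors`) are invented for the example.

```python
import heapq

def budget_aware_search(start, goal, successors, budget):
    """Best-first search that prunes any branch whose accumulated
    action cost exceeds the total budget.

    Hypothetical sketch of cost-aware tree search, not the paper's
    exact CATS procedure (which additionally uses LLM guidance).
    """
    # Frontier entries: (cost_so_far, state, plan_so_far)
    frontier = [(0, start, [])]
    best_cost = {start: 0}
    while frontier:
        cost, state, plan = heapq.heappop(frontier)
        if state == goal:
            return plan, cost
        for action, next_state, action_cost in successors(state):
            new_cost = cost + action_cost
            if new_cost > budget:
                continue  # cost-aware pruning: branch is infeasible
            if new_cost < best_cost.get(next_state, float("inf")):
                best_cost[next_state] = new_cost
                heapq.heappush(frontier, (new_cost, next_state, plan + [action]))
    return None, None  # no plan fits within the budget

# Toy action graph: edges are (action_name, next_state, action_cost).
graph = {
    "start": [("a1", "mid", 2), ("a2", "mid", 5)],
    "mid": [("a3", "goal", 3)],
    "goal": [],
}
plan, cost = budget_aware_search("start", "goal", graph.__getitem__, 10)
# A tight budget forces the search to report infeasibility instead.
tight_plan, tight_cost = budget_aware_search("start", "goal", graph.__getitem__, 4)
```

Under the loose budget the search returns the cheaper plan `["a1", "a3"]` at cost 5; under a budget of 4 it exhausts the (pruned) frontier and reports that no feasible plan exists, mirroring the tight-versus-loose constraint behavior described above.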
Primary Area: neurosymbolic & hybrid AI systems (physics-informed, logic & formal reasoning, etc.)
Submission Number: 20433