Informed Tree of Thought: Cost-efficient Problem Solving with Large Language Models

Published: 10 Oct 2024, Last Modified: 19 Nov 2024 · AFM 2024 Poster · CC BY 4.0
Keywords: Informed Tree of Thought, Multi-step LLM Reasoning, A* Search, D* Lite, Tool Usage, Dynamic Re-planning
TL;DR: iToT, a framework that enhances LLM-based problem-solving by integrating informed search strategies with dynamic re-planning and tool interaction.
Abstract: This paper introduces Informed Tree of Thought (iToT), a novel framework that addresses the challenge of improving the reasoning and dynamic re-planning capabilities of large language models (LLMs) in complex tasks involving external tools. By integrating tool usage with informed search algorithms, iToT optimizes decision-making while accounting for tool costs and failures. The framework builds on existing methods such as Chain of Thought (CoT) and Tree of Thought (ToT), and extends them with two variants, iToT-A* and iToT-D* Lite, for informed search and efficient dynamic re-planning during task execution. Our solution is evaluated on the HotPotQA dataset, where it outperforms several baselines, including direct prompting and ToT approaches. In our experiments, iToT demonstrates superior performance on complex reasoning tasks by minimizing tool costs and effectively managing tool interactions. All methods are implemented using open-source models, ensuring broad accessibility and reproducibility.
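
To make the core idea concrete, below is a minimal sketch of what an A*-style search over a tree of LLM-generated thoughts could look like, with accumulated tool cost as the path cost g and an LLM-derived estimate of remaining cost as the heuristic h. The callables `expand`, `heuristic`, and `is_goal` are hypothetical interfaces standing in for the paper's actual thought-generation, self-evaluation, and goal-check components; this is an illustrative sketch, not the authors' implementation.

```python
import heapq
import itertools

def itot_a_star(root_thought, expand, heuristic, is_goal, max_expansions=100):
    """A*-style best-first search over a tree of LLM-generated thoughts.

    expand(thought)    -> iterable of (child_thought, step_cost) pairs, where
                          step_cost reflects the tool-call cost of producing
                          that child (hypothetical interface).
    heuristic(thought) -> estimated remaining cost to a solution, e.g. an LLM
                          self-evaluation score mapped to a cost (assumption).
    is_goal(thought)   -> True when the thought answers the task.
    """
    counter = itertools.count()  # tie-breaker so heapq never compares thoughts
    frontier = [(heuristic(root_thought), next(counter), 0.0,
                 root_thought, [root_thought])]
    for _ in range(max_expansions):
        if not frontier:
            break
        f, _, g, thought, path = heapq.heappop(frontier)
        if is_goal(thought):
            return path, g  # solution path and its accumulated tool cost
        for child, step_cost in expand(thought):
            g_child = g + step_cost  # tool cost accumulated so far
            f_child = g_child + heuristic(child)
            heapq.heappush(frontier,
                           (f_child, next(counter), g_child, child, path + [child]))
    return None, float("inf")  # search budget exhausted without a solution
```

The iToT-D* Lite variant would differ mainly in how it reacts to tool failures: rather than restarting the search, it would locally repair cost estimates around the failed node and re-plan from the current frontier, which is the standard motivation for D* Lite over A* in dynamic environments.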
Submission Number: 10