Keywords: Large Language Models, Classical Planning, Transition Function, State-Action Embeddings, Latent Space, Neural Networks, AI Planning
TL;DR: Embedding states and actions in latent space for efficient and scalable LLM-based planning.
Abstract: Large Language Models (LLMs) excel in natural language processing tasks but face significant challenges in classical planning, where accurate and feasible transitions between states are required. This work presents a novel approach that embeds states and actions into a structured feature space, using a shallow neural network as a transition function. By performing planning in this latent space, the method significantly reduces the computational cost of frequent LLM calls while retaining logical consistency. We evaluate the framework as a classifier, demonstrating promising results on state-transition prediction and planning tasks in domains described in natural language. The approach offers insights into efficient and scalable LLM-based planning, bridging the gap between natural language understanding and practical planning systems.
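To make the idea concrete, below is a minimal sketch of the kind of latent transition model the abstract describes, written in PyTorch as an assumption: the class name `LatentTransitionModel`, the embedding dimensions, and the single-hidden-layer architecture are illustrative choices, not details taken from the paper.

```python
import torch
import torch.nn as nn

class LatentTransitionModel(nn.Module):
    """Shallow MLP mapping a (state, action) embedding pair to a predicted
    next-state embedding, so planning can roll forward in latent space
    without an LLM call per step."""
    def __init__(self, state_dim: int, action_dim: int, hidden_dim: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim + action_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, state_dim),  # predicts the next-state embedding
        )

    def forward(self, state_emb: torch.Tensor, action_emb: torch.Tensor) -> torch.Tensor:
        # Concatenate state and action embeddings and predict the successor state.
        return self.net(torch.cat([state_emb, action_emb], dim=-1))

# Hypothetical usage: in practice the embeddings would come from an LLM encoder;
# random tensors stand in for them here.
model = LatentTransitionModel(state_dim=768, action_dim=768)
s = torch.randn(1, 768)   # placeholder state embedding
a = torch.randn(1, 768)   # placeholder action embedding
next_s = model(s, a)      # predicted next-state embedding, shape (1, 768)
```

In this reading, each planning step is a cheap forward pass through the shallow network rather than an LLM query, which is where the claimed efficiency gain would come from.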
Submission Number: 25