A Demonstration of Natural Language Understanding in Embodied Planning Agents

Published: 04 Jun 2024, Last Modified: 04 Jun 2024
Venue: ICAPS-24 Demos
License: CC BY 4.0
Keywords: Embodied Planning Agents, Natural Language Understanding
TL;DR: Demonstrating embodied planning agents that use LLMs to translate natural language requests into PDDL goals and execute the resulting plans in the environment.
Abstract: Autonomous agents operating in human worlds must understand and respond to the natural language humans use to communicate their task needs. In this paper, we present an approach to language understanding in embodied planning agents. Our approach uses recent advances in large language models to translate human language into a meaningful representation. This representation is further analyzed through grounded reasoning, which connects the information it contains with the agent's current beliefs about the state of the world. Grounded reasoning produces a goal description in PDDL, which is given to a planner to generate a plan that is then executed. We demonstrate our approach in AI2Thor, an interactive, simulated home domain that is becoming a standard benchmark for conversational embodied agents.
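The abstract describes a pipeline from natural language to executed plan. The sketch below illustrates that flow under assumed, hypothetical interfaces (llm, ground_fn, planner, agent are illustrative names, not the paper's API): an LLM translates the request into an intermediate representation, grounded reasoning maps it to a PDDL goal given the agent's beliefs, and a planner's plan is executed in the environment.

```python
# Minimal sketch of the described NL -> PDDL -> plan -> execution pipeline.
# All objects and function names here are hypothetical placeholders, not the paper's code.

def handle_request(utterance, belief_state, llm, ground_fn, planner, agent):
    # 1. Use the LLM to translate the natural-language request
    #    into an intermediate, machine-readable representation.
    representation = llm.translate(utterance)

    # 2. Grounded reasoning: connect the representation with the agent's
    #    current beliefs about the world to obtain a PDDL goal description.
    pddl_goal = ground_fn(representation, belief_state)

    # 3. Call a classical planner on the current state and grounded goal.
    plan = planner.solve(belief_state, pddl_goal)

    # 4. Execute the plan step by step in the environment (e.g. AI2Thor).
    for action in plan:
        agent.execute(action)
```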
Submission Number: 5