Lang2LTL: Translating Natural Language Commands to Temporal Specification with Large Language Models

Published: 15 Nov 2022, Last Modified: 05 May 2023 · LangRob 2022 Poster
Keywords: Robots, Language, Linear Temporal Logic
TL;DR: We propose a modular approach built on large language models that translates natural language instructions into formal specifications for robotic systems.
Abstract: Robotic systems that interact with humans through natural language must adequately represent a wide range of challenging tasks. Linear temporal logic (LTL) has become a prevalent specification language for non-Markovian tasks, such as completing subtasks in a specific order and executing tasks repetitively. In this work, we frame the problem of grounding natural language commands into LTL expressions as a neural machine translation problem, leveraging the capabilities of pre-trained large language models (LLMs). A key challenge for translation tasks is collecting a large corpus of paired language commands and translated specifications. LLMs have demonstrated few-shot learning capabilities on many natural language tasks and can be used to overcome this data-scarcity challenge. We propose Lang2LTL, a new model architecture for translating natural language commands to LTL specifications. Results in navigation domains show that our modular approach outperforms end-to-end baselines in translation accuracy and is more sample efficient than the encoder-decoder baseline when generalizing across environments.
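To make the translation framing concrete, below is a minimal sketch of few-shot prompting an LLM to map a navigation command to an LTL formula. It uses the OpenAI Python client as an assumed backend; the model name, prompt format, and example command/LTL pairs are illustrative assumptions, not the authors' Lang2LTL pipeline or dataset.

```python
# Minimal sketch (not the authors' implementation): few-shot prompting an LLM
# to translate a natural language command into an LTL formula.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# A handful of paired (command, LTL) examples used as in-context demonstrations.
# These pairs are made up for illustration.
FEW_SHOT_PAIRS = [
    ("go to the kitchen and then go to the bedroom", "F (kitchen & F bedroom)"),
    ("never enter the office", "G ! office"),
    ("visit the lobby infinitely often", "G F lobby"),
]

def translate_to_ltl(command: str) -> str:
    """Ask the LLM to produce an LTL formula for a navigation command."""
    demos = "\n".join(f"Command: {c}\nLTL: {l}" for c, l in FEW_SHOT_PAIRS)
    prompt = (
        "Translate each natural language command into a linear temporal "
        "logic (LTL) formula over atomic propositions naming locations.\n\n"
        f"{demos}\nCommand: {command}\nLTL:"
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # hypothetical choice; any instruction-tuned LLM could be substituted
        messages=[{"role": "user", "content": prompt}],
        temperature=0.0,
    )
    return response.choices[0].message.content.strip()

if __name__ == "__main__":
    print(translate_to_ltl("go to the cafe, but avoid the parking lot until then"))
```

A modular pipeline, as described in the abstract, would typically separate this translation step from grounding the resulting propositions to entities in a specific environment, which is what allows generalization across environments without retraining.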