SayTap: Language to Quadrupedal Locomotion

Published: 30 Aug 2023, Last Modified: 03 Jul 2024, CoRL 2023 Poster
Keywords: Large language model (LLM), Quadrupedal robots, Locomotion
TL;DR: We propose using foot contact patterns as an interface that bridges human commands in natural language and low-level robot commands
Abstract: Large language models (LLMs) have demonstrated the potential to perform high-level planning. Yet, it remains a challenge for LLMs to comprehend low-level commands, such as joint angle targets or motor torques. This paper proposes an approach that uses foot contact patterns as an interface bridging human commands in natural language and a locomotion controller that outputs these low-level commands. The result is an interactive system for quadrupedal robots that allows users to flexibly craft diverse locomotion behaviors. We contribute an LLM prompt design, a reward function, and a method to expose the controller to the feasible distribution of contact patterns. The outcome is a controller capable of achieving diverse locomotion patterns that can be transferred to real robot hardware. Compared with other design choices, the proposed approach achieves a success rate of over 50% in predicting the correct contact patterns and solves 10 more tasks out of a total of 30 tasks. (\url{https://saytap.github.io})
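The contact-pattern interface described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's code: it assumes a gait is encoded as a binary matrix with one row per foot (front-left, front-right, rear-left, rear-right) and one column per timestep, where 1 means the foot is in ground contact. The function name `make_pattern` and the specific trot timing are hypothetical.

```python
import numpy as np

def make_pattern(cycle, repeats):
    """Tile a one-cycle contact pattern (shape 4 x cycle_len) `repeats` times
    along the time axis to form a full contact-pattern matrix."""
    return np.tile(np.asarray(cycle, dtype=np.int8), (1, repeats))

# A simple trot: diagonal foot pairs (FL+RR and FR+RL) alternate contact.
# Rows: front-left, front-right, rear-left, rear-right.
TROT_CYCLE = [
    [1, 1, 0, 0],  # front-left
    [0, 0, 1, 1],  # front-right
    [0, 0, 1, 1],  # rear-left
    [1, 1, 0, 0],  # rear-right
]

pattern = make_pattern(TROT_CYCLE, repeats=3)
print(pattern.shape)        # (4, 12)
print(pattern.sum(axis=0))  # exactly two feet in contact at every timestep
```

An LLM can emit such a matrix (e.g. as text) in response to a natural-language command, and a learned low-level controller trained to track contact patterns then realizes the gait on the robot.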
Student First Author: no
Supplementary Material: zip
Instructions: I have read the instructions for authors (https://corl2023.org/instructions-for-authors/)
Website: https://saytap.github.io
Publication Agreement: pdf
Poster Spotlight Video: mp4
Community Implementations: 1 code implementation (https://www.catalyzex.com/paper/saytap-language-to-quadrupedal-locomotion/code)