RoBERT: Low-Cost Bi-Directional Sequence Model for Flexible Robot Behavior Control

19 Sept 2023 (modified: 25 Mar 2024) · ICLR 2024 Conference Withdrawn Submission
Keywords: Imitation Learning, Sequence Model, Transformer, Robotics
Abstract: The requirement of human involvement for data collection or system design has long been a major challenge in building robot control policies. In this paper, we present $\textbf{Ro}$bot-$\textbf{BERT}$ (RoBERT), a method for building general robot control policies for complex behaviors with $\textit{minimal}$ human effort. Starting from an unsupervisedly collected dataset, RoBERT requires no human labels, high-quality behavior dataset, or accurate system model, in contrast to most other methods for building general robot agents. RoBERT is further pre-trained via $\textit{Masked Action-Inverse-Inference}$ (MAII), a method inspired by $\textit{Masked Language Modeling}$ (MLM) in BERT-like language models, and has the potential to enable $\textit{zero-shot}$, $\textit{multi-task}$, $\textit{keyframe-based}$ robot control with little architectural change and a user-friendly interface. In our empirical study, RoBERT is successfully applied to various types of robots in simulated environments and generates stable and flexible behaviors that fulfill complex commands.
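
Based only on the MLM analogy stated in the abstract, the following is a minimal sketch of what an MAII-style pre-training objective could look like: randomly mask actions in a state-action trajectory and train a bi-directional transformer to recover them from the surrounding context. All names here (`MAIIModel`, `maii_loss`, the masking ratio, the interleaved token layout) are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of a Masked Action-Inverse-Inference (MAII) objective,
# assuming the BERT/MLM analogy: mask a fraction of actions in a state-action
# trajectory and train a bidirectional encoder to recover them.
import torch
import torch.nn as nn


class MAIIModel(nn.Module):
    def __init__(self, state_dim, action_dim, d_model=128, n_layers=4, n_heads=4):
        super().__init__()
        self.state_proj = nn.Linear(state_dim, d_model)
        self.action_proj = nn.Linear(action_dim, d_model)
        self.mask_token = nn.Parameter(torch.zeros(d_model))  # learned [MASK] embedding
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)  # bi-directional: no causal mask
        self.action_head = nn.Linear(d_model, action_dim)

    def forward(self, states, actions, action_mask):
        # states: (B, T, state_dim), actions: (B, T, action_dim)
        # action_mask: (B, T) bool, True where the action is hidden from the model
        s = self.state_proj(states)
        a = self.action_proj(actions)
        a = torch.where(action_mask.unsqueeze(-1), self.mask_token.expand_as(a), a)
        # interleave state and action tokens: s_0, a_0, s_1, a_1, ...
        tokens = torch.stack([s, a], dim=2).flatten(1, 2)  # (B, 2T, d_model)
        h = self.encoder(tokens)
        return self.action_head(h[:, 1::2])  # predictions at action positions


def maii_loss(model, states, actions, mask_ratio=0.15):
    """Randomly mask actions and regress them from bidirectional context (MLM-style)."""
    action_mask = torch.rand(actions.shape[:2], device=actions.device) < mask_ratio
    pred = model(states, actions, action_mask)
    # as in MLM, the loss is computed only at the masked positions
    return ((pred - actions) ** 2)[action_mask].mean()
```

At inference time, a keyframe-based query could plausibly be posed the same way: supply the desired states and mask the intermediate actions, letting the bidirectional model fill them in; this usage is likewise an assumption drawn from the abstract's description.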
Supplementary Material: zip
Primary Area: reinforcement learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 1700