Robot Learning with Sensorimotor Pre-training

Published: 30 Aug 2023, Last Modified: 03 Jul 2024
Venue: CoRL 2023 Oral
Readers: Everyone
Keywords: Robot Learning, Self-supervised, Sensorimotor, Pre-training
Abstract: We present a self-supervised sensorimotor pre-training approach for robotics. Our model, called RPT, is a Transformer that operates on sequences of sensorimotor tokens. Given a sequence of camera images, proprioceptive robot states, and actions, we encode the sequence into tokens, mask out a subset, and train a model to predict the missing content from the rest. We hypothesize that if a robot can predict the masked-out content it will have acquired a good model of the physical world that can enable it to act. RPT is designed to operate on latent visual representations which makes prediction tractable, enables scaling to larger models, and allows fast inference on a real robot. To evaluate our approach, we collected a dataset of 20,000 real-world trajectories over 9 months using a combination of motion planning and grasping algorithms. We find that sensorimotor pre-training consistently outperforms training from scratch, has favorable scaling properties, and enables transfer across different tasks, environments, and robots.
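The masked-prediction recipe described in the abstract can be illustrated with a short sketch. The class below is an assumption-laden PyTorch illustration, not the released RPT code (see the Code link below for that): per-timestep visual latents, proprioceptive states, and actions are projected into a shared token space, a random subset of tokens is replaced with a learned mask token, a Transformer encoder processes the sequence, and a reconstruction loss is computed only on the masked positions. All dimensions, the mask ratio, and the squared-error loss are illustrative choices, not the paper's exact settings.

```python
import torch
import torch.nn as nn


class MaskedSensorimotorModel(nn.Module):
    """Masked prediction over interleaved sensorimotor tokens (illustrative sketch)."""

    def __init__(self, latent_dim=768, proprio_dim=7, action_dim=7,
                 d_model=256, nhead=8, depth=4):
        super().__init__()
        # Project each modality into a shared token space.
        # Visual inputs are assumed to be pre-computed latent features, not raw pixels.
        self.vis_proj = nn.Linear(latent_dim, d_model)
        self.proprio_proj = nn.Linear(proprio_dim, d_model)
        self.action_proj = nn.Linear(action_dim, d_model)
        # Learned token that stands in for masked-out content.
        self.mask_token = nn.Parameter(torch.zeros(1, 1, d_model))
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=depth)
        # Per-modality heads that reconstruct the original token content.
        self.vis_head = nn.Linear(d_model, latent_dim)
        self.proprio_head = nn.Linear(d_model, proprio_dim)
        self.action_head = nn.Linear(d_model, action_dim)

    def forward(self, vis, proprio, action, mask_ratio=0.5):
        # vis: (B, T, latent_dim), proprio: (B, T, proprio_dim), action: (B, T, action_dim).
        # Interleave (visual, proprio, action) tokens per timestep.
        # (Positional embeddings are omitted for brevity.)
        tokens = torch.stack([self.vis_proj(vis),
                              self.proprio_proj(proprio),
                              self.action_proj(action)], dim=2)
        B, T, M, D = tokens.shape
        tokens = tokens.reshape(B, T * M, D)
        # Randomly mask a subset of tokens and replace them with the mask token.
        mask = torch.rand(B, T * M, device=tokens.device) < mask_ratio
        x = torch.where(mask.unsqueeze(-1), self.mask_token.expand_as(tokens), tokens)
        h = self.encoder(x).reshape(B, T, M, D)
        # Reconstruct each modality; penalize errors only at masked positions.
        preds = [self.vis_head(h[:, :, 0]),
                 self.proprio_head(h[:, :, 1]),
                 self.action_head(h[:, :, 2])]
        mask = mask.reshape(B, T, M)
        loss = 0.0
        for i, (pred, target) in enumerate(zip(preds, [vis, proprio, action])):
            m = mask[:, :, i].unsqueeze(-1).float()
            loss = loss + ((pred - target) ** 2 * m).sum() / (m.sum() * pred.shape[-1] + 1e-8)
        return loss


# Example usage with random tensors standing in for a batch of trajectories.
model = MaskedSensorimotorModel()
vis = torch.randn(2, 16, 768)      # pre-computed visual latents
proprio = torch.randn(2, 16, 7)    # proprioceptive robot states
action = torch.randn(2, 16, 7)     # actions
loss = model(vis, proprio, action)
loss.backward()
```

Consistent with the abstract, the visual inputs in this sketch are latent representations rather than raw pixels, which keeps the prediction targets low-dimensional and inference fast.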
Student First Author: yes
Instructions: I have read the instructions for authors (https://corl2023.org/instructions-for-authors/)
Website: https://robotic-pretrained-transformer.github.io
Code: https://github.com/ir413/rpt
Publication Agreement: pdf
Community Implementations: [1 code implementation](https://www.catalyzex.com/paper/robot-learning-with-sensorimotor-pre-training/code)
