Visual Whole-Body Control for Legged Loco-Manipulation

Published: 05 Sept 2024, Last Modified: 08 Nov 2024 · CoRL 2024 · CC BY 4.0
Keywords: Robot Learning; Reinforcement Learning; Imitation Learning; Mobile Loco-Manipulation
Abstract: We study the problem of mobile manipulation using legged robots equipped with an arm, namely legged loco-manipulation. The robot legs, while usually used for mobility, offer an opportunity to amplify manipulation capabilities through whole-body control: the robot can control the legs and the arm simultaneously to extend its workspace. We propose a framework that performs this whole-body control autonomously from visual observations. Our approach, Visual Whole-Body Control (VBC), is composed of a low-level policy that uses all degrees of freedom to track the body velocities and the end-effector position, and a high-level policy that proposes these velocities and the end-effector position based on visual inputs. We train both levels of policies in simulation and perform Sim2Real transfer for real-robot deployment. Extensive experiments show significant improvements over baselines in picking up diverse objects across different configurations (heights, locations, orientations) and environments.
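The two-level structure described in the abstract can be sketched as a control loop in which a high-level policy maps visual features to a command (body velocity plus end-effector position target) and a low-level policy maps proprioception and that command to joint targets. The sketch below is a minimal, hypothetical illustration, not the authors' implementation: the network sizes, feature dimensions (64-D visual features, 48-D proprioception, 12 leg + 6 arm joints), and random stand-in weights are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(sizes):
    """Random-weight stand-in for a trained policy network (hypothetical)."""
    weights = [rng.standard_normal((a, b)) * 0.1
               for a, b in zip(sizes[:-1], sizes[1:])]
    def forward(x):
        for i, w in enumerate(weights):
            x = x @ w
            if i < len(weights) - 1:
                x = np.tanh(x)
        return x
    return forward

# High-level policy: visual observation -> 6-D command
# (3-D body velocity + 3-D end-effector position target; dims are assumptions)
high_level = mlp([64, 32, 6])

# Low-level policy: proprioception + command -> joint targets
# (12 leg joints + 6 arm joints for an arm-equipped quadruped; an assumption)
low_level = mlp([48 + 6, 64, 18])

def control_step(visual_features, proprioception):
    """One tick of the hierarchical loop: high level commands, low level tracks."""
    command = high_level(visual_features)
    body_vel, ee_target = command[:3], command[3:]
    joint_targets = low_level(np.concatenate([proprioception, command]))
    return body_vel, ee_target, joint_targets

body_vel, ee_target, joints = control_step(
    rng.standard_normal(64), rng.standard_normal(48))
```

In the paper's framework both levels are trained in simulation and transferred to the real robot; here the loop only demonstrates the data flow between the two policies.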
Supplementary Material: zip
Video: https://youtu.be/Kfj1ImFLoqI
Website: https://wholebody-b1.github.io/
Code: https://github.com/Ericonaldo/visual_wholebody
Publication Agreement: pdf
Student Paper: yes
Submission Number: 12