HumanPlus: Humanoid Shadowing and Imitation from Humans

Published: 05 Sept 2024 · Last Modified: 15 Oct 2024 · CoRL 2024 · CC BY 4.0
Keywords: Humanoids, Learning from Human Data, Whole-Body Control
TL;DR: a full-stack humanoid system for learning motion and autonomous skills from human data
Abstract: One of the key arguments for building robots with form factors similar to human beings is that we can leverage the massive amount of human data for training. Yet doing so has remained challenging in practice due to the complexities of humanoid perception and control, lingering physical gaps between humanoids and humans in morphology and actuation, and the lack of a data pipeline for humanoids to learn autonomous skills from egocentric vision. In this paper, we introduce a full-stack system for humanoids to learn motion and autonomous skills from human data. We first train a low-level policy in simulation via reinforcement learning using existing 40-hour human motion datasets. This policy transfers to the real world and allows humanoid robots to follow human body and hand motion in real time using only an RGB camera, i.e., shadowing. Through shadowing, human operators can teleoperate humanoids to collect whole-body data for learning different tasks in the real world. Using the data collected, we then perform supervised behavior cloning to train skill policies using egocentric vision, allowing humanoids to complete different tasks autonomously by imitating human skills. We demonstrate the system on our customized 33-DoF, 180 cm humanoid, which autonomously completes tasks such as wearing a shoe to stand up and walk, folding a sweatshirt, rearranging objects, typing, and greeting another robot, with 60-100% success rates using up to 40 demonstrations.
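The shadowing loop described in the abstract can be sketched as follows. This is a minimal illustration only, not the paper's implementation: the pose estimator and the linear "policy" are hypothetical stand-ins (the actual system uses a learned RGB pose estimator and an RL-trained transformer whole-body controller), and the observation layout is an assumption.

```python
# Minimal sketch of a HumanPlus-style two-level control step (hypothetical
# names and shapes; the real low-level policy is trained with RL in
# simulation and the pose estimator runs on a single RGB camera).
import random

NUM_DOF = 33            # the paper's humanoid has 33 degrees of freedom
OBS_DIM = 2 * NUM_DOF   # assumed proprioception: joint positions + velocities

def estimate_human_pose(rgb_frame):
    """Stand-in for RGB body/hand pose estimation -> retargeted joint targets."""
    return [0.0] * NUM_DOF

class LowLevelPolicy:
    """Stand-in for the trained whole-body controller used for shadowing."""
    def __init__(self):
        # Random linear map as a placeholder for a trained network.
        self.w = [[random.uniform(-0.01, 0.01)
                   for _ in range(OBS_DIM + NUM_DOF)]
                  for _ in range(NUM_DOF)]

    def act(self, proprio, target_pose):
        # Concatenate proprioception with the retargeted human pose,
        # then map to one joint command per degree of freedom.
        x = proprio + target_pose
        return [sum(wi * xi for wi, xi in zip(row, x)) for row in self.w]

def shadow_step(policy, rgb_frame, proprio):
    """One control step: human pose from RGB -> joint commands for the robot."""
    target = estimate_human_pose(rgb_frame)
    return policy.act(proprio, target)

policy = LowLevelPolicy()
action = shadow_step(policy, rgb_frame=None, proprio=[0.0] * OBS_DIM)
assert len(action) == NUM_DOF
```

The same structure supports the paper's autonomy stage: replace the human-pose input with a behavior-cloned skill policy conditioned on egocentric vision, while the low-level controller stays fixed.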
Supplementary Material: zip
Spotlight Video: mp4
Video: https://youtu.be/6r4ZxpJjdx8
Website: https://humanoid-ai.github.io/
Code: https://github.com/MarkFzp/humanplus
Publication Agreement: pdf
Student Paper: yes
Submission Number: 7