Learning Dexterous Deformable Object Manipulation Through Cross-Embodiment Dynamics Learning

Published: 25 Jun 2025, Last Modified: 25 Jun 2025 · Dex-RSS-25 · CC BY 4.0
Keywords: Model Learning, Contact-Rich Manipulation, Cross-Embodiment Learning, Dexterous Manipulation, Deformable Manipulation
TL;DR: We show that a particle-based dynamics model learned from human data can act as a general interface for cross-embodiment dexterous deformable object manipulation.
Abstract: Dexterous manipulation of deformable objects remains a core challenge in robotics due to complex contact dynamics and high-dimensional control. While humans excel at such tasks, transferring these skills to robots is hindered by embodiment gaps. In this work, we propose using particle-based dynamics models as an embodiment-agnostic interface, enabling robots to learn directly from human-object interaction data. By representing both manipulators and objects as particles, we define a shared state and action space across embodiments. Using human demonstrations, we train a graph neural network dynamics model that leverages spatial locality and equivariance to generalize across differing embodiment shapes and structures. For control, we convert embodiment-specific joint actions into particle displacements via forward kinematics, enabling model-based planning in the shared representation space. We demonstrate that our approach transfers manipulation skills from humans to both low-DoF and high-DoF robot hands, achieving real-world clay reshaping without motion retargeting, expert demonstrations, or analytical simulation.
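The pipeline the abstract describes (shared particle state, joint actions mapped to particle displacements via forward kinematics, and model-based planning with a learned dynamics model) can be illustrated with a minimal sketch. Everything below is an assumption for illustration: the function names, the toy forward kinematics, the stand-in for the learned graph-network dynamics, the one-sided Chamfer cost, and the random-shooting planner are placeholders, not the paper's actual implementation or API.

```python
import numpy as np

# --- Hypothetical stand-ins; names, shapes, and logic are illustrative assumptions ---

def forward_kinematics_particles(joint_angles):
    """Placeholder for embodiment-specific forward kinematics: maps joint angles
    to 3D positions of particles sampled on the manipulator surface."""
    # Toy planar chain: each joint contributes one particle (illustration only).
    angles = np.cumsum(joint_angles)
    return np.stack([np.cos(angles), np.sin(angles), np.zeros_like(angles)], axis=-1)

def learned_dynamics(object_particles, hand_particles, hand_displacements):
    """Placeholder for the learned particle dynamics model (a GNN in the paper).
    Here: object particles are dragged toward nearby moving hand particles."""
    dists = np.linalg.norm(object_particles[:, None] - hand_particles[None], axis=-1)
    weights = np.exp(-dists / 0.05)                       # soft spatial locality
    weights /= weights.sum(axis=1, keepdims=True) + 1e-8
    return object_particles + weights @ hand_displacements

def shape_cost(object_particles, target_particles):
    """One-sided Chamfer distance from predicted object particles to the target shape."""
    d = np.linalg.norm(object_particles[:, None] - target_particles[None], axis=-1)
    return d.min(axis=1).mean()

# --- Random-shooting planning in the shared particle representation ---

def plan_joint_action(joint_angles, object_particles, target_particles,
                      num_samples=64, action_scale=0.05, rng=None):
    """Sample candidate joint-space actions, convert each to particle displacements
    via forward kinematics, roll out the dynamics model, and keep the best one."""
    if rng is None:
        rng = np.random.default_rng(0)
    hand_now = forward_kinematics_particles(joint_angles)
    best_action, best_cost = None, np.inf
    for _ in range(num_samples):
        delta_q = action_scale * rng.standard_normal(joint_angles.shape)
        hand_next = forward_kinematics_particles(joint_angles + delta_q)
        displacements = hand_next - hand_now              # embodiment-agnostic action
        pred = learned_dynamics(object_particles, hand_now, displacements)
        cost = shape_cost(pred, target_particles)
        if cost < best_cost:
            best_action, best_cost = delta_q, cost
    return best_action, best_cost

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    q = np.zeros(4)                                       # toy 4-DoF hand
    clay = rng.uniform(-0.5, 0.5, size=(200, 3))          # current object particles
    target = rng.uniform(-0.3, 0.3, size=(200, 3))        # desired shape particles
    action, cost = plan_joint_action(q, clay, target, rng=rng)
    print("chosen joint action:", action, "predicted cost:", cost)
```

The key point the sketch tries to convey is the interface: the planner only ever reasons in particle space, so swapping in a different embodiment (a human hand for data collection, a low-DoF or high-DoF robot hand for execution) only changes the forward-kinematics mapping, not the dynamics model or the cost.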
Submission Number: 25