NeuralGrasps: Learning Implicit Representations for Grasps of Multiple Robotic Hands

CoRL 2022 Poster. Published 16 Jun 2022, last modified 15 Nov 2022.
Student First Author: yes
Keywords: Robot Grasping, Neural Implicit Representations, Grasp Transfer, Grasping Contact Modeling, 6D Object Pose Estimation
TL;DR: Learning implicit representations for multiple robot hands in a common encoding space for flexible grasp transfer.
Abstract: We introduce a neural implicit representation for grasps of objects from multiple robotic hands. Grasps from different robotic hands are encoded into a shared latent space, and each latent vector is trained to decode to the signed distance functions of two 3D shapes: the object and the robotic hand in its grasping pose. In addition, the distance metric of the latent space is learned to preserve the similarity between grasps across different robotic hands, where grasp similarity is defined by the contact regions of the hands. This property enables our method to transfer grasps between different grippers, including a human hand; such transfer has the potential to share grasping skills between robots and to let robots learn grasping skills from humans. Furthermore, the encoded signed distance functions of objects and grasps in our implicit representation can be used for 6D object pose estimation from partial point clouds via grasping contact optimization, which enables robotic grasping in the real world.
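
To make the architecture described in the abstract concrete, below is a minimal sketch of a DeepSDF-style decoder with two signed-distance heads (object and hand) conditioned on a shared grasp latent code, plus nearest-neighbor grasp transfer in that latent space. This is an illustration under stated assumptions, not the authors' released implementation: the names GraspSDFDecoder, transfer_grasp, latent_dim, and the two-head design are hypothetical.

```python
# Minimal sketch (assumptions noted above), not the NeuralGrasps codebase.
import torch
import torch.nn as nn


class GraspSDFDecoder(nn.Module):
    """Decode a grasp latent code into object and hand signed distances."""

    def __init__(self, latent_dim=256, hidden_dim=512):
        super().__init__()
        # Input: latent code concatenated with a 3D query point.
        self.backbone = nn.Sequential(
            nn.Linear(latent_dim + 3, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim), nn.ReLU(),
        )
        # Two heads: signed distance to the object surface and signed
        # distance to the hand surface in its grasping pose.
        self.object_sdf = nn.Linear(hidden_dim, 1)
        self.hand_sdf = nn.Linear(hidden_dim, 1)

    def forward(self, latent, points):
        # latent: (B, latent_dim), points: (B, N, 3)
        z = latent.unsqueeze(1).expand(-1, points.shape[1], -1)
        feats = self.backbone(torch.cat([z, points], dim=-1))
        return self.object_sdf(feats), self.hand_sdf(feats)


def transfer_grasp(query_latent, latent_bank, hand_ids, target_hand):
    """Transfer a grasp by nearest-neighbor lookup in the shared latent
    space. Because the latent metric is trained so that grasps with
    similar contact regions map to nearby codes, the closest code
    belonging to the target hand is a plausible transferred grasp."""
    mask = hand_ids == target_hand          # codes of the target gripper
    candidates = latent_bank[mask]          # (K, latent_dim)
    dists = torch.cdist(query_latent.unsqueeze(0), candidates).squeeze(0)
    return candidates[dists.argmin()]
```

Note that a single backbone with two output heads ties the object and hand geometry to the same latent code, and the nearest-neighbor lookup in transfer_grasp is only meaningful because the latent distance metric is trained on contact-region similarity, as the abstract describes.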
Website: https://irvlutd.github.io/NeuralGrasps
Code: https://irvlutd.github.io/NeuralGrasps