Geometry-Aware 6-DoF Grasp Detection in Complex Scenes

20 Sept 2025 (modified: 13 Nov 2025) · ICLR 2026 Conference Withdrawn Submission · CC BY 4.0
Keywords: Robotic manipulation; 6-DoF grasp detection; Transparent object grasping
Abstract: Data-driven methods have made significant progress in 6-DoF grasp detection for robotic applications. However, reliably detecting grasps in cluttered scenes containing transparent objects remains a challenge. To address this, we introduce TransCG-Grasp, an annotated extension of the TransCG dataset, to advance research in transparent object grasping. We further propose GA-Grasp, a novel geometry-aware 6-DoF grasp detection method designed to improve grasping of both transparent and general objects. GA-Grasp incorporates a modality-aware sparse tensor module and a geometry-aware sparse U-Net, leveraging RGB, depth, and surface normals to predict graspable points and generate final 6-DoF grasp poses. Extensive experiments on the TransCG-Grasp and GraspNet-1Billion datasets demonstrate that GA-Grasp outperforms existing methods. Notably, GA-Grasp surpasses the current state-of-the-art (SOTA) by a margin of 10.06% AP on the TransCG-Grasp dataset. In real-world experiments, GA-Grasp achieves success rates of 82.0% for transparent objects and 90.6% for general objects, with a 100% task completion rate, further validating its effectiveness for real-world robotic manipulation. The code and trained models will be released upon acceptance.
Supplementary Material: zip
Primary Area: applications to robotics, autonomy, planning
Submission Number: 24194
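As a minimal illustration of the multi-modal input described in the abstract (RGB, depth, and surface normals), the sketch below shows one way such per-point features could be assembled from an RGB-D frame under a pinhole camera model. The function names, camera intrinsics, and array shapes are illustrative assumptions, not the paper's implementation.

import numpy as np

def backproject_depth(depth, fx, fy, cx, cy):
    # Back-project a metric depth map (H, W) into a per-pixel 3-D point map (H, W, 3).
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.stack([x, y, depth], axis=-1)

def estimate_normals(points):
    # Approximate surface normals from image-space gradients of the point map.
    dpdx = np.gradient(points, axis=1)
    dpdy = np.gradient(points, axis=0)
    n = np.cross(dpdx, dpdy)
    return n / (np.linalg.norm(n, axis=-1, keepdims=True) + 1e-8)

# Illustrative inputs (random stand-ins for a real RGB-D frame and camera intrinsics).
depth = np.random.uniform(0.3, 1.5, size=(480, 640)).astype(np.float32)
rgb = np.random.rand(480, 640, 3).astype(np.float32)

points = backproject_depth(depth, fx=600.0, fy=600.0, cx=320.0, cy=240.0)
normals = estimate_normals(points)

valid = depth > 0                                              # mask out missing depth readings
coords = points[valid]                                         # (N, 3) coordinates for a sparse 3-D backbone
feats = np.concatenate([rgb[valid], normals[valid]], axis=-1)  # (N, 6) RGB + normal features per point
print(coords.shape, feats.shape)

In a sparse-convolution pipeline, coordinates and features of this form would typically be voxelized and fed to a sparse U-Net; that step is omitted here because it depends on the specific backbone used.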