Probabilistic Closed-Loop Active Grasping

Henry Schaub, Christian Wolff, Maximilian Hoh, Alfred Schöttl

Published: 01 Apr 2024 · Last Modified: 03 Mar 2026 · IEEE Robotics and Automation Letters · CC BY-SA 4.0
Abstract: Picking a specific object is an essential task in assistive robotics. While the majority of grasp detection approaches focus on grasp synthesis from a single depth image or point cloud, this is often not viable in an unstructured, uncontrolled environment. Due to occlusion, heavy noise, or simply because no collision-free grasp is visible from some perspectives, it is beneficial to collect additional information from other views before committing to grasp execution. We present a closed-loop approach that selects and navigates towards the next-best-view by minimizing the entropy of the volume under consideration. We use a local measure of the estimation uncertainty of the surface reconstruction to sample grasps and estimate their success probabilities in an online fashion. Our experiments show that our algorithm achieves better grasp success rates than comparable approaches when presented with challenging household objects.
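The core idea of entropy-driven next-best-view selection can be illustrated with a minimal sketch. The snippet below is not the paper's implementation; it assumes a simple per-voxel occupancy representation of the volume and a precomputed visibility mask per candidate view (both hypothetical simplifications), and scores each view by how much voxel entropy it would observe:

```python
import numpy as np

def voxel_entropy(p):
    """Shannon entropy of independent per-voxel occupancy probabilities.

    p is clipped away from 0 and 1 for numerical stability.
    """
    p = np.clip(p, 1e-9, 1 - 1e-9)
    return -(p * np.log(p) + (1 - p) * np.log(1 - p))

def next_best_view(occupancy, visibility):
    """Pick the candidate view covering the most remaining uncertainty.

    occupancy:  (N,) per-voxel occupancy probabilities of the volume
    visibility: (V, N) boolean mask; visibility[v, i] is True if voxel i
                is expected to be observable from candidate view v
    (Both inputs are assumptions of this sketch, not the paper's model.)
    """
    h = voxel_entropy(occupancy)                 # (N,) per-voxel entropy
    scores = visibility.astype(float) @ h        # (V,) entropy seen by each view
    return int(np.argmax(scores)), scores
```

In a closed loop, the robot would move to the selected view, fuse the new depth observation into the occupancy estimate, and repeat until the sampled grasps' estimated success probability is high enough to commit to execution.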