High-Precision Object Pose Estimation Using Visual-Tactile Information for Dynamic Interactions in Robotic Grasping
Abstract: In many robotic applications, accurate object pose estimation is essential for high-precision tasks such as factory assembly or everyday insertion tasks. Tactile sensing, which complements visual information, provides rich texture- and force-based data for object pose estimation. However, previous pose estimation methods typically overlook dynamic situations, such as slippage of a grasped object or movement of a contacted object during interaction with the environment, which increase the complexity of pose estimation. To address these challenges, we propose an efficient method that fuses visual and tactile sensing to estimate object poses through particle filtering. We leverage visual information to track the pose of the contacted object in real time and estimate pose changes of the grasped object from displacement data obtained by tactile sensors. Experimental evaluation on 13 objects with diverse geometric shapes demonstrates high-precision pose estimation and shows that the robot copes robustly with dynamic scenes in which objects are forced to move, confirming the framework's adaptability to practical scenarios with uncertainty.
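The abstract names the core ingredients (visual pose tracking, tactile displacement estimates, particle filtering) without giving implementation detail. The sketch below is a minimal, illustrative particle filter in that spirit, not the authors' method: the planar (x, y, θ) state, the class name `PosePF`, and all noise parameters are assumptions made for illustration. The tactile displacement drives the motion model and the visual pose serves as the measurement.

```python
import numpy as np

rng = np.random.default_rng(seed=0)


class PosePF:
    """Minimal particle filter over a planar object pose (x, y, theta)."""

    def __init__(self, n=500, init_pose=(0.0, 0.0, 0.0), init_std=(0.01, 0.01, 0.05)):
        self.particles = np.asarray(init_pose) + rng.normal(0.0, init_std, size=(n, 3))
        self.weights = np.full(n, 1.0 / n)

    def predict(self, tactile_disp, motion_std=(0.002, 0.002, 0.01)):
        # Propagate each particle by the tactile displacement estimate, plus
        # process noise to account for slip-measurement uncertainty.
        self.particles += np.asarray(tactile_disp) + rng.normal(
            0.0, motion_std, self.particles.shape
        )

    def update(self, visual_pose, meas_std=(0.005, 0.005, 0.02)):
        # Reweight particles under a Gaussian likelihood of the visual pose.
        diff = self.particles - np.asarray(visual_pose)
        diff[:, 2] = (diff[:, 2] + np.pi) % (2 * np.pi) - np.pi  # wrap angle error
        log_w = -0.5 * np.sum((diff / meas_std) ** 2, axis=1)
        w = np.exp(log_w - log_w.max())  # subtract max for numerical stability
        self.weights = w / w.sum()
        self._resample_if_degenerate()

    def _resample_if_degenerate(self):
        # Multinomial resampling when the effective sample size drops
        # below half the particle count.
        n = len(self.weights)
        if 1.0 / np.sum(self.weights ** 2) < n / 2:
            idx = rng.choice(n, size=n, p=self.weights)
            self.particles = self.particles[idx]
            self.weights = np.full(n, 1.0 / n)

    def estimate(self):
        # Weighted mean pose; the angle is averaged on the unit circle.
        xy = self.weights @ self.particles[:, :2]
        c = self.weights @ np.cos(self.particles[:, 2])
        s = self.weights @ np.sin(self.particles[:, 2])
        return np.array([xy[0], xy[1], np.arctan2(s, c)])


# One filtering step: a tactile slip reading drives prediction and a
# camera-based pose observation drives the update.
pf = PosePF()
pf.predict(tactile_disp=[0.001, 0.0, 0.002])
pf.update(visual_pose=[0.004, -0.002, 0.01])
print(pf.estimate())
```

Splitting the roles this way mirrors the abstract's division of labor: tactile displacement captures pose change between observations (e.g., during slip), while the visual tracker anchors the absolute pose whenever the object is visible.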
External IDs: dblp:conf/icra/PengCCLYY25