Learning Open-World Visual-Tactile Grasp Stability Prediction with Synthetic Data

17 Sept 2025 (modified: 11 Feb 2026) · Submitted to ICLR 2026 · CC BY 4.0
Keywords: Visual-Tactile Data; FEM-based Simulation; Grasping
Abstract: Grasp stability prediction from visual-tactile data is an important problem in robotics. Most prior work learns these predictors from limited real-world data, and their evaluation has been restricted to a single, simple laboratory environment. Our work studies open-world visual-tactile grasp stability prediction, i.e., the predictor should generalize zero-shot to novel objects in novel environments. To this end, we propose learning from synthetic visual-tactile data generated with FEM-based simulation and ray-tracing rendering. In our experiments, we show that our simulation pipeline has much higher physical fidelity than rigid-body simulation. Furthermore, the predictor trained on our synthetic dataset achieves higher accuracy on open-world grasp stability prediction tasks than models trained on a real-world dataset or on synthetic data from rigid-body simulation.
Primary Area: applications to robotics, autonomy, planning
Submission Number: 9870