Interactive Visuo-Tactile Learning to Estimate Properties of Articulated Objects

Published: 29 Oct 2024 · Last Modified: 03 Nov 2024 · CoRL 2024 Workshop MRM-D Poster · CC BY 4.0
Keywords: Perception for Grasp and Manipulation, Visuo-Tactile Sensing, Active Learning
TL;DR: The paper presents a novel interactive learning and perception framework that seamlessly combines vision and tactile sensing to infer the properties of articulated objects through versatile push-pull interactions.
Abstract: Robotic systems operating in unstructured environments must infer key physical properties of objects, such as stiffness, mass, center of mass, friction, and shape, to ensure stable manipulation. Accurate estimation of these properties is crucial for predicting manipulation outcomes and planning effectively. In this work, we present a novel framework for identifying the properties of challenging articulated objects through versatile, non-prehensile push-pull actions and visuo-tactile observations. Our approach introduces a differentiable filtering method that embeds interaction physics into graph neural networks, enabling the system to actively learn object-robot interactions and consistently infer both directly observable pose information and indirectly observable physical parameters. Experimental results on real robotic systems show that our method outperforms existing baselines in efficiency and accuracy.
Submission Number: 35
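
The abstract's core technical idea is a differentiable filter whose process model is a graph neural network encoding object-robot interaction physics, corrected at each step by visuo-tactile observations. The page gives no implementation details, so the following is only a minimal sketch of that general recipe, not the authors' architecture: the message-passing scheme, the learned-gain update, and all class names, shapes, and dimensions are assumptions made for illustration.

```python
import torch
import torch.nn as nn


class GNNProcessModel(nn.Module):
    """One round of message passing over an object-robot interaction graph.

    Node features hold pose plus latent physical parameters (e.g. stiffness,
    friction); edge features encode contacts or articulation between bodies.
    (Hypothetical design, not the paper's.)
    """

    def __init__(self, node_dim: int, edge_dim: int, hidden: int = 64):
        super().__init__()
        self.edge_mlp = nn.Sequential(
            nn.Linear(2 * node_dim + edge_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden))
        self.node_mlp = nn.Sequential(
            nn.Linear(node_dim + hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, node_dim))

    def forward(self, nodes, edges, edge_index):
        src, dst = edge_index                     # each of shape (E,)
        # Compute a message per edge from sender, receiver, and edge features.
        msg = self.edge_mlp(torch.cat([nodes[src], nodes[dst], edges], dim=-1))
        # Sum incoming messages at each receiver node.
        agg = torch.zeros(nodes.size(0), msg.size(-1), device=nodes.device)
        agg.index_add_(0, dst, msg)
        # Residual node update: the GNN predicts a state change.
        return nodes + self.node_mlp(torch.cat([nodes, agg], dim=-1))


class DifferentiableFilter(nn.Module):
    """Predict-update loop: GNN dynamics prediction, learned-gain correction."""

    def __init__(self, node_dim: int, edge_dim: int, obs_dim: int):
        super().__init__()
        self.process = GNNProcessModel(node_dim, edge_dim)
        self.obs_model = nn.Linear(node_dim, obs_dim)  # predicted observation
        self.gain = nn.Linear(obs_dim, node_dim)       # learned Kalman-like gain

    def step(self, nodes, edges, edge_index, obs):
        pred = self.process(nodes, edges, edge_index)  # predict
        innovation = obs - self.obs_model(pred)        # visuo-tactile residual
        return pred + self.gain(innovation)            # update


# Hypothetical usage: 3 bodies (robot, handle, door), 2 contact edges.
nodes = torch.randn(3, 10)                    # pose + latent params per body
edges = torch.randn(2, 4)                     # contact/articulation features
edge_index = torch.tensor([[0, 1],            # senders: robot, handle
                           [1, 2]])           # receivers: handle, door
obs = torch.randn(3, 6)                       # per-body visuo-tactile reading
filt = DifferentiableFilter(node_dim=10, edge_dim=4, obs_dim=6)
nodes = filt.step(nodes, edges, edge_index, obs)
```

Because every step is differentiable, filtering losses on the estimated states can be backpropagated end to end. Note that a full differentiable filter (e.g. a differentiable EKF) would derive the gain from propagated covariances; the learned linear gain above is a deliberate simplification.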