Keywords: In-Hand Object Rotation, Tactile Sensing, Reinforcement Learning, Sim2Real, Transformer, Visuotactile Manipulation
TL;DR: We present a reinforcement learning policy capable of rotating a diverse set of objects over multiple axes using its fingertips.
Abstract: We introduce RotateIt, a system that enables fingertip-based object rotation along multiple axes by leveraging multimodal sensory inputs. Our system is trained in simulation, where it has access to ground-truth object shapes and physical properties. We then distill it to operate on realistic yet noisy simulated visuotactile and proprioceptive sensory inputs. These multimodal inputs are fused via a visuotactile transformer, enabling online inference of object shapes and physical properties during deployment. We show significant performance improvements over prior methods and highlight the importance of visual and tactile sensing.
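The abstract describes fusing visual, tactile, and proprioceptive inputs via a visuotactile transformer. A minimal sketch of this kind of multimodal fusion is below; the class name, feature dimensions, and pooling choice are illustrative assumptions, not the paper's actual architecture.

```python
import torch
import torch.nn as nn

class VisuotactileFusion(nn.Module):
    """Hypothetical sketch: project per-modality features to a shared width,
    treat each modality as a token, and fuse them with self-attention."""

    def __init__(self, d_model=128, n_heads=4, n_layers=2):
        super().__init__()
        # Assumed per-modality feature sizes (not from the paper).
        self.vis_proj = nn.Linear(64, d_model)   # visual embedding
        self.tac_proj = nn.Linear(32, d_model)   # tactile embedding
        self.prop_proj = nn.Linear(16, d_model)  # proprioception embedding
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)

    def forward(self, vis, tac, prop):
        # One token per modality: (batch, 3, d_model).
        tokens = torch.stack(
            [self.vis_proj(vis), self.tac_proj(tac), self.prop_proj(prop)],
            dim=1,
        )
        fused = self.encoder(tokens)  # cross-modal attention over the 3 tokens
        return fused.mean(dim=1)      # pooled embedding for a downstream policy

# Usage with random stand-in features:
model = VisuotactileFusion()
out = model(torch.randn(2, 64), torch.randn(2, 32), torch.randn(2, 16))
```

In practice each projection would sit atop a modality-specific encoder (e.g. a CNN over images and tactile arrays), and the pooled output would condition the distilled control policy.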
Student First Author: yes
Instructions: I have read the instructions for authors (https://corl2023.org/instructions-for-authors/)
Website: https://haozhi.io/rotateit/
Publication Agreement: pdf
Video: https://www.youtube.com/watch?v=Uh-ltingRzk
Poster Spotlight Video: mp4