VisTac: Toward a Unified Multimodal Sensing Finger for Robotic Manipulation

Sheeraz Athar, Gaurav Patel, Zhengtong Xu, Qiang Qiu, Yu She

Published: 15 Oct 2023, Last Modified: 14 Oct 2025 · Crossref · CC BY-SA 4.0
Abstract: Tactile sensors are crucial for providing the contact geometry information necessary for object manipulation. However, it remains highly nontrivial for a tactile sensor to perceive a distant object before manipulation. In this work, we present an innovative robotic finger, VisTac, which seamlessly combines high-resolution tactile and visual perception in a single unified device while adhering to essential mechanical constraints, such as a human-finger-like wedge-shaped top that is vital for manipulation tasks. Furthermore, we discuss the fabrication of a key component of the device: a semitransparent membrane with light-dependent opacity, which enables the contact surface to transition effortlessly between visual and tactile modes. With a compact two-camera configuration, we demonstrate the sensor's multimodal sensing capabilities through 3-D reconstruction of tactile imprints, distant object localization via vision, and pose estimation using both visual and tactile feedback. VisTac can thus accurately locate an object in 3-D space using vision and then adeptly manipulate it according to the task at hand, relying on tactile feedback. To demonstrate these abilities, we carry out a peg-in-hole insertion task. This work primarily seeks to pave the way for future research toward developing unified visual–tactile sensors.