Keywords: Robotic Touch, Disentanglement, Shear, Object Reconstruction
Abstract: Robotic touch, particularly when using soft optical tactile sensors, suffers from distortion caused by motion-dependent shear: the manner in which the sensor contacts a stimulus is entangled with the tactile information about the stimulus geometry. In this work, we propose a supervised convolutional deep neural network that learns to disentangle, in its latent space, the components of sensor deformation caused by contact geometry from those due to sliding-induced shear. The approach is validated by showing a close match between unsheared images reconstructed from sheared images and their vertical-tap (non-sheared) counterparts. In addition, the unsheared tactile images faithfully reconstruct the contact geometry masked in the sheared data and allow robust estimation of the contact pose, which is of use for sliding exploration of various planar shapes. Overall, contact geometry reconstruction in conjunction with sliding exploration was used for faithful full-object reconstruction of various planar shapes. The methods have broad applicability to deep learning models for robots with a shear-sensitive sense of touch.
Supplementary Material: zip