On Explainability and Sensor-Adaptability of a Robot Tactile Texture Representation Using Two-Stage Recurrent Networks

Published: 01 Jan 2021 · Last Modified: 11 Feb 2025 · IROS 2021 · CC BY-SA 4.0
Abstract: The ability to simultaneously distinguish objects, materials, and their associated physical properties is a fundamental function of the sense of touch. Recent advances in tactile sensors and machine learning techniques allow more accurate and complex modelling of robotic tactile sensations. However, many state-of-the-art (SotA) approaches focus solely on constructing black-box models to achieve ever higher classification accuracy and fail to adapt across sensors with unique spatio-temporal data formats. In this work, we propose the Explainable and Sensor-Adaptable Recurrent Networks (ExSARN) model for tactile texture representation. The ExSARN model consists of a two-stage recurrent network fed by a sensor-specific header network. The first-stage recurrent network emulates human touch receptors and decouples sensor-specific tactile sensations into different frequency response bands, while the second stage, a variational recurrent autoencoder, encodes the overall temporal signature. We infuse the latent representation with ternary labels that qualitatively represent texture properties (e.g. roughness and stiffness), which facilitates representation learning and provides explainability to the latent space. The ExSARN model is tested on texture datasets collected with two different tactile sensors. Our results show that the proposed model not only achieves higher accuracy but also provides adaptability across sensors with different sampling frequencies and data formats. The addition of the crudely obtained qualitative property labels offers a practical approach to enhance the interpretability of the latent space, facilitate property inference on unseen materials, and improve the overall performance of the model.
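
For readers who prefer to reason in code, the sketch below shows one plausible way the pipeline described in the abstract could be wired together in PyTorch: a sensor-specific header, a first stage of parallel recurrent units standing in for frequency-tuned receptor channels, and a second-stage variational recurrent autoencoder with a head for the ternary property labels. All class names, dimensions, and the band-handling details are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of the two-stage idea described in the abstract.
# Module names, dimensions, and the band-splitting scheme are assumptions
# made for illustration; they are not the authors' code.
import torch
import torch.nn as nn


class SensorHeader(nn.Module):
    """Sensor-specific header: maps a sensor's raw spatio-temporal format
    to a common per-timestep feature vector (assumed shape: B x T x C)."""
    def __init__(self, in_channels, feat_dim=64):
        super().__init__()
        self.proj = nn.Linear(in_channels, feat_dim)

    def forward(self, x):  # x: (B, T, in_channels)
        return torch.relu(self.proj(x))


class FirstStageRNN(nn.Module):
    """Stage 1: parallel GRUs loosely emulating receptor channels tuned to
    different frequency bands (the band separation itself is assumed to be
    done upstream, e.g. by band-pass filtering the input signal)."""
    def __init__(self, feat_dim=64, hidden=32, n_bands=3):
        super().__init__()
        self.bands = nn.ModuleList(
            [nn.GRU(feat_dim, hidden, batch_first=True) for _ in range(n_bands)]
        )

    def forward(self, x):  # x: (B, T, feat_dim)
        outs = [gru(x)[0] for gru in self.bands]
        return torch.cat(outs, dim=-1)  # (B, T, hidden * n_bands)


class VariationalRecurrentAE(nn.Module):
    """Stage 2: encodes the overall temporal signature into a latent code
    (mu, logvar); a small head predicts ternary property labels
    (e.g. roughness/stiffness in {-1, 0, +1}) to structure the latent space."""
    def __init__(self, in_dim=96, latent=16, n_props=2):
        super().__init__()
        self.n_props = n_props
        self.enc = nn.GRU(in_dim, 64, batch_first=True)
        self.mu = nn.Linear(64, latent)
        self.logvar = nn.Linear(64, latent)
        self.prop_head = nn.Linear(latent, n_props * 3)  # 3 classes per property

    def forward(self, x):  # x: (B, T, in_dim)
        _, h = self.enc(x)            # h: (1, B, 64)
        h = h.squeeze(0)
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterize
        props = self.prop_head(z).view(-1, self.n_props, 3)      # ternary logits
        return z, mu, logvar, props


# Example: a batch of 100-step sequences from a hypothetical 24-taxel sensor.
x = torch.randn(8, 100, 24)
feats = SensorHeader(24)(x)
bands = FirstStageRNN()(feats)
z, mu, logvar, prop_logits = VariationalRecurrentAE()(bands)
```

A decoder for reconstruction and the corresponding losses (reconstruction, KL, and property-label terms) are omitted here; the sketch only illustrates how a sensor-specific header could let sensors with different formats share the same two-stage backbone.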
