Understanding Touch Through Latent Spaces: Can Images and Haptic Maps Reflect Human Perception?

Antonio Luigi Stefani, Sara Baldoni, Niccolò Bisagno, Federica Battisti, Nicola Conci, Francesco G. B. De Natale

Published: 2025, Last Modified: 15 Apr 2026. ICCVW 2025. License: CC BY-SA 4.0
Abstract: Extended Reality (XR) systems are increasingly incorporating multi-sensory stimuli to enhance realism and user immersion. Among these, the integration of tactile feedback plays a crucial role. Yet, the pipeline for acquiring, processing, and rendering haptic information—especially in synchrony with visual stimuli—remains largely unstandardized. A common strategy for capturing tactile data involves encoding it as haptic maps, essentially image-based representations of touch. However, the effectiveness of both visual and tactile modalities in modeling perceptual haptic properties is not yet fully understood. In this study, we analyze the representational power of haptic maps and RGB images from the Touch and Go dataset using latent space analysis. Specifically, we investigate whether a neural network can structure the latent space in a way that reflects human perceptual attributes such as roughness, hardness, and colorfulness. Our findings contribute to understanding whether haptic maps can serve as reliable proxies for tactile data and align with how humans perceive material properties, marking a step forward toward perceptually grounded haptic representations in XR environments.