Touch and Go: Learning from Human-Collected Vision and Touch

Published: 17 May 2022 (modified: 12 Oct 2022), NeurIPS 2022 Datasets and Benchmarks Track
TL;DR: We introduce “Touch and Go”, a human-collected dataset containing paired visual and tactile data from real-world scenes.
Abstract: The ability to associate sight with touch is essential for understanding material properties, and for physically interacting with the world. Learning these correlations, however, has proven challenging, since existing datasets have not captured the full diversity of these modalities. To address this shortcoming, we propose a dataset for multimodal visuo-tactile learning called Touch and Go, in which human data collectors probe objects in natural environments with tactile sensors, while recording egocentric video. The objects and scenes in our dataset are significantly more diverse than prior efforts, making the data well-suited to tasks that involve understanding material properties and physical interactions in the wild. To demonstrate our dataset's effectiveness, we successfully apply it to a variety of tasks: 1) self-supervised visuo-tactile feature learning, 2) tactile-driven image stylization, i.e., making the visual appearance of an object more consistent with a given tactile signal, and 3) predicting future frames of a tactile signal from visuo-tactile inputs.
Supplementary Material: pdf
URL: https://touch-and-go.github.io/
License: CC BY 4.0
Author Statement: Yes
Contribution Process Agreement: Yes
In Person Attendance: Yes
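
The abstract's first application, self-supervised visuo-tactile feature learning, can be pictured with a small contrastive-learning sketch: paired image and tactile frames are embedded by separate encoders and pulled together with an InfoNCE-style loss, treating co-occurring frames as positives and other pairs in the batch as negatives. The encoder choice, embedding size, temperature, and sensor assumption below are illustrative only, not the paper's configuration.

```python
# Minimal sketch of contrastive visuo-tactile feature learning on paired
# image/tactile frames. All architectural choices here are assumptions
# for illustration, not the authors' exact method.
import torch
import torch.nn as nn
import torch.nn.functional as F
import torchvision.models as models


class VisuoTactileContrast(nn.Module):
    """Two ResNet-18 encoders projected into a shared embedding space."""

    def __init__(self, dim: int = 128, temperature: float = 0.07):
        super().__init__()
        self.image_encoder = models.resnet18(weights=None)
        self.touch_encoder = models.resnet18(weights=None)
        # Replace the classification heads with projection layers.
        self.image_encoder.fc = nn.Linear(512, dim)
        self.touch_encoder.fc = nn.Linear(512, dim)
        self.temperature = temperature

    def forward(self, image: torch.Tensor, touch: torch.Tensor) -> torch.Tensor:
        # Embed both modalities and L2-normalize.
        z_img = F.normalize(self.image_encoder(image), dim=-1)
        z_tac = F.normalize(self.touch_encoder(touch), dim=-1)
        # InfoNCE: matched (image, touch) pairs are positives,
        # all other pairs in the batch serve as negatives.
        logits = z_img @ z_tac.t() / self.temperature
        targets = torch.arange(image.size(0), device=image.device)
        return 0.5 * (F.cross_entropy(logits, targets) +
                      F.cross_entropy(logits.t(), targets))


# Usage: a batch of paired egocentric RGB frames and tactile sensor
# images (assumed here to be rendered as 3-channel images).
model = VisuoTactileContrast()
rgb = torch.randn(8, 3, 224, 224)
tactile = torch.randn(8, 3, 224, 224)
loss = model(rgb, tactile)
loss.backward()
```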