AnySkin: Plug-and-play Skin Sensing for Robotic Touch

Published: 10 Nov 2024, Last Modified: 10 Nov 2024
Venue: CoRL-X-Embodiment-WS 2024 Poster
License: CC BY 4.0
Keywords: Tactile Sensing, Soft Robotics
Abstract: While tactile sensing is widely accepted as an important and useful sensing modality, its use pales in comparison to other modalities like vision and proprioception. AnySkin addresses the critical challenges that impede the adoption of tactile sensing: versatility, replaceability, and data reusability. Building on the simple design of ReSkin and decoupling the sensing electronics from the sensing interface, AnySkin makes integration as straightforward as putting on a phone case and connecting a charger. Furthermore, AnySkin is the first uncalibrated tactile sensor to report cross-instance generalizability of learned manipulation policies. To summarize, this work makes three key contributions: first, we introduce a streamlined fabrication process and a design tool for creating an adhesive-free, durable, and replaceable magnetic tactile sensor; second, we characterize slip detection and policy learning with the AnySkin sensor; third, we demonstrate zero-shot generalization of models trained on one instance of AnySkin to new instances, and compare it with popular existing tactile solutions like DIGIT and ReSkin. Videos and details can be found at https://anon-anyskin.github.io/.
Previous Publication: No
Submission Number: 11