Abstract: Recent research shows growing interest in adopting touch interaction for robot learning, yet it remains challenging to efficiently acquire high-quality, structured tactile data at low cost. In this study, we propose the design of vision-based soft robotic tongs to generate reproducible and shareable tactile interaction data for learning. We further developed a web-based platform for convenient data collection and a portable assembly that can be deployed within minutes. We trained a simple network to infer the 6D force and torque from the relative pose of markers on the fingers, achieving reasonably high accuracy (an MAE of 0.548 N at 60 Hz over a [0, 20] N range) at a cost of only 50 USD per set. The recorded tactile data is downloadable for robot learning. We further demonstrated the system with robotic arms in manipulation learning and remote control. The system is open-sourced on GitHub with further information. (https://github.com/bionicdl-sustech/SoftRoboticTongs)