FreeTacMan: Robot-free Visuo-Tactile Data Collection System for Contact-rich Manipulation

Published: 01 Aug 2025, Last Modified: 03 Aug 2025 · CoRL 2025 Demos Oral · CC BY 4.0
Keywords: Visuo-tactile Sensing, Data Collection, Robot Manipulation
Abstract: Enabling robots with $\textbf{contact-rich manipulation}$ remains a pivotal challenge in robot learning, one substantially hindered by the $\textbf{data collection}$ gap: inefficient collection pipelines and limited sensor setups. Traditional teleoperation systems provide no direct, real-time tactile signals, and impose a fixed robot setup with complex calibration or high latency. While prior work has explored handheld paradigms, their rod-based mechanical structures remain rigid and unintuitive, providing limited tactile feedback and posing challenges for human operators. Motivated by the dexterity and force feedback of human motion, we propose $\textbf{FreeTacMan}$, a human-centric and robot-free data collection system for accurate and efficient robot manipulation. Concretely, we design a wearable data collection device with dual $\textbf{visuo-tactile}$ grippers, worn directly on human fingers for intuitive and natural control. A high-precision optical tracking system captures end-effector poses while synchronizing visual and tactile feedback. FreeTacMan improves data collection performance on multiple axes compared to prior works, and the resulting visuo-tactile information enables effective policy learning for contact-rich manipulation tasks.
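The abstract mentions synchronizing end-effector poses from optical tracking with visual and tactile feedback. One common way to align such independently timestamped streams is nearest-timestamp matching with a skew threshold; the sketch below is purely illustrative (all names and the `max_skew` parameter are assumptions, not the authors' pipeline).

```python
# Hypothetical nearest-timestamp synchronization of three recorded streams:
# optical-tracking poses, visual frames, and tactile frames. Illustrative
# only; not the FreeTacMan implementation.
from bisect import bisect_left


def nearest(timestamps, t):
    """Index of the entry in a sorted timestamp list closest to t."""
    i = bisect_left(timestamps, t)
    if i == 0:
        return 0
    if i == len(timestamps):
        return len(timestamps) - 1
    # Pick whichever neighbor is closer to t.
    return i if timestamps[i] - t < t - timestamps[i - 1] else i - 1


def synchronize(pose_ts, visual_ts, tactile_ts, max_skew=0.02):
    """For each pose timestamp, find the nearest visual and tactile frames;
    drop tuples whose worst-case skew exceeds max_skew seconds."""
    aligned = []
    for k, t in enumerate(pose_ts):
        vi = nearest(visual_ts, t)
        ti = nearest(tactile_ts, t)
        skew = max(abs(visual_ts[vi] - t), abs(tactile_ts[ti] - t))
        if skew <= max_skew:
            aligned.append((k, vi, ti))
    return aligned
```

For example, with poses at 10 Hz and slightly offset camera clocks, `synchronize([0.0, 0.1, 0.2], [0.01, 0.11, 0.35], [0.005, 0.095, 0.21])` keeps the first two pose timestamps (worst skew 10 ms) and drops the third, whose nearest visual frame is 90 ms away.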
Supplementary Material: zip
Submission Number: 9