Keywords: Human-Robot Interaction, Teleoperation, Proprioception, Virtual Reality, Soft Robotics
TL;DR: SoftBiT enhances soft bimanual robot teleoperation by integrating real-time proprioceptive shape sensing with XR visualization, improving user awareness and control during manipulation tasks, especially under occlusion.
Abstract: Soft robotic teleoperation offers unique advantages for bimanual manipulation, but users often struggle with limited visual feedback during operation, particularly when the robot's fingers are occluded by objects. We introduce SoftBiT, a teleoperation interface that enhances user awareness through real-time visualization of soft robot finger shapes in extended reality (XR). Our system combines proprioceptive sensing with an XR headset (Meta Quest 2) to provide users with intuitive visual feedback about finger deformations during manipulation tasks. SoftBiT's key innovation is a real-time sim-to-real pipeline that estimates and visualizes soft finger shapes, helping users better understand robot-object interactions even when direct visual feedback is limited. Through three representative tasks (pick-and-place, assembly, and object deformation), we demonstrate how augmented proprioceptive feedback supports user decision-making during manipulation. Our shape estimation system runs at 42.55 FPS, enabling smooth real-time visualization. This work lays the foundation for future user studies investigating how proprioceptive augmentation affects teleoperation performance and user experience, with potential extensions to more complex multi-fingered manipulation tasks.
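To make the pipeline described in the abstract concrete, the sketch below shows one way such a sensing-to-visualization loop could be structured: proprioceptive readings are mapped to an estimated finger mesh and handed to an XR visualizer within a fixed frame budget. This is a hypothetical illustration, not the authors' implementation; the names read_sensors, estimate_shape, and send_to_xr, the placeholder linear shape model, and the constants N_SENSORS and N_VERTICES are assumptions, and only the 42.55 FPS figure comes from the abstract.

```python
# Hypothetical sketch (not the paper's code): a minimal real-time loop that
# maps proprioceptive sensor readings to an estimated finger shape and hands
# it to an XR visualizer, throttled to a target frame rate.
import time
import numpy as np

TARGET_FPS = 42.55   # frame rate reported in the abstract
N_SENSORS = 4        # assumed number of proprioceptive channels per finger
N_VERTICES = 128     # assumed size of the visualized finger mesh

# Placeholder shape model: a fixed random linear map standing in for the
# paper's sim-to-real shape estimator.
rng = np.random.default_rng(0)
shape_model = rng.standard_normal((N_VERTICES * 3, N_SENSORS))

def read_sensors() -> np.ndarray:
    """Stand-in for reading proprioceptive bend/strain sensor values."""
    return rng.uniform(0.0, 1.0, size=N_SENSORS)

def estimate_shape(readings: np.ndarray) -> np.ndarray:
    """Map sensor readings to per-vertex 3D offsets of the finger mesh."""
    return (shape_model @ readings).reshape(N_VERTICES, 3)

def send_to_xr(vertices: np.ndarray) -> None:
    """Stand-in for streaming the deformed mesh to the XR headset client."""
    pass

frame_period = 1.0 / TARGET_FPS
for _ in range(100):  # short demo loop
    t0 = time.perf_counter()
    send_to_xr(estimate_shape(read_sensors()))
    # Sleep off the remainder of the frame budget to hold the target rate.
    time.sleep(max(0.0, frame_period - (time.perf_counter() - t0)))
```

In an actual deployment, estimate_shape would be replaced by the paper's learned sim-to-real estimator and send_to_xr would stream the mesh to the Quest 2 client rather than being a no-op.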
Submission Number: 4