A User Study on Augmented Reality-Based Robot Learning Data Collection Interfaces

Published: 21 Oct 2023, Last Modified: 21 Oct 2023 — CoRL 2023 Workshop TGR Poster
Keywords: Robot data collection, Human robot interaction
Abstract: Future versatile, generalist robots need the ability to learn new tasks and behaviors from demonstrations. Technologies such as Virtual and Augmented Reality (VR/AR) enable immersive, visualized environments that accelerate and facilitate the collection of high-quality demonstrations. However, it remains unclear which interface is the most intuitive and effective for humans creating demonstrations in a virtualized environment. The intuitiveness and efficiency of these interfaces become particularly important when working with non-expert users and complex manipulation tasks. To this end, this work investigates five different interfaces in a comprehensive user study across various virtualized tasks. In addition, this work proposes a so far unexplored interaction interface: the combination of a physical robot for kinesthetic teaching with a virtual environment visualized through augmented reality. The environment, including all objects and a robot manipulator, is virtualized using an AR system. The virtual robot is controlled via various interfaces, i.e., Hand-Tracking, Virtual Kinesthetic Teaching, Gamepad, Motion Controller, and Physical Kinesthetic Teaching. This study reveals valuable insights into the usability and effectiveness of these interaction interfaces. It shows that our newly proposed intuitive interface for AR control, i.e., using a physical robot as a controller, significantly outperforms the other interfaces in terms of success rates and task completeness. Moreover, the results show that the motion controller and hand-tracking are also promising interfaces, in particular for cases where a physical robot is not available.
Submission Number: 29