When And Where Are You Going? A Mixed-Reality Framework for Human Robot Collaboration

19 Feb 2022, 06:20 (modified: 01 Jul 2022, 18:53) · VAM-HRI 2022
Keywords: Mixed-Reality, HRC, Human Subject Study
Abstract: Fluency and coordination in human-robot collaborative tasks depend heavily on shared situational awareness among the interaction partners. This paper presents a work-in-progress framework for Intention Projection (IntPro). To this end, we propose a mixed-reality setup that combines monocular computer vision with adaptive projection mapping to convey the robot's intentions and next actions, projecting this information into the environment in the form of visual cues. A human subject study based on a generic joint sorting task is proposed to validate the framework. Visual cues about the robot's intentions are provided to the human via two main modes: a) highlighting the object that the human needs to interact with, and b) visualizing the robot's upcoming movements. This work hypothesizes that combining these fundamental modes enables fast and effective signaling, which, in turn, improves task efficiency, transparency, and safety.