Investigating explainable human-robot interaction with augmented reality

Published: 04 Mar 2022 (Last Modified: 05 May 2023), VAM-HRI 2022
Keywords: Explainable robotics, Augmented Reality, human-robot interaction, behavioral user studies
TL;DR: We propose and plan to test Explainable AI cues in Augmented Reality as a new feedback modality to give a human teacher access to the robot's situation understanding during teaching by demonstration.
Abstract: In learning by demonstration with social robots, fluid and coordinated interaction between the human teacher and the robotic learner is particularly critical, yet often difficult to assess. This is all the more true when robots are to learn from non-expert users. In such cases, it can be difficult for the teacher to grasp what the robot knows, or to judge whether a correct representation of the task has formed, before the robot demonstrates it back. Here, we introduce a new feedback modality that uses Augmented Reality to visualize the robot's perceptual beliefs in an interactive way. These cues are overlaid directly on the shared workspace as the teacher perceives it, without requiring an explicit inquiry. This gives the teacher access to the robot's situation understanding, allowing them to adapt their demonstration online and, at the end, to review the observed sequence. We further propose an experimental framework to assess the benefits of such a feedback modality, compared to more established modalities such as gaze and speech, and to collect dyadic data in a quick, integrated, and relatively realistic way. The planned user study will help assess human-robot coordination across communicative cues and the combination of different modalities for explainable robotics.
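The abstract does not describe an implementation, but the core rendering step such a modality needs can be sketched. The following is a minimal illustration, not the authors' system: all names (Belief, project_to_overlay, the camera intrinsics) are hypothetical. It projects the robot's world-frame perceptual beliefs into the teacher's view with a standard pinhole model, yielding the 2D anchors where AR overlay cues would be drawn.

```python
# Minimal sketch (hypothetical names; not the paper's API): map the robot's
# perceptual beliefs -- labeled object positions with confidences -- to 2D
# pixel anchors in the teacher's AR headset view via a pinhole projection.
from dataclasses import dataclass
import numpy as np

@dataclass
class Belief:
    label: str            # what the robot believes the object is
    position: np.ndarray  # 3D position in the shared workspace frame (m)
    confidence: float     # detection confidence in [0, 1]

def project_to_overlay(beliefs, K, T_cam_world):
    """Return (label, confidence, (u, v)) anchors for AR cues."""
    cues = []
    for b in beliefs:
        p_world = np.append(b.position, 1.0)  # homogeneous coordinates
        p_cam = T_cam_world @ p_world         # workspace -> headset camera
        if p_cam[2] <= 0:                     # behind the camera: no cue
            continue
        u, v, w = K @ p_cam[:3]               # pinhole projection
        cues.append((b.label, b.confidence, (u / w, v / w)))
    return cues

# Toy usage: one believed object, camera at the workspace origin.
K = np.array([[600.0, 0.0, 320.0],
              [0.0, 600.0, 240.0],
              [0.0, 0.0, 1.0]])
T = np.eye(4)
beliefs = [Belief("cup", np.array([0.1, 0.0, 0.8]), 0.92)]
print(project_to_overlay(beliefs, K, T))
```

In a real system the belief list would come from the robot's perception stack and the camera pose from the headset's tracking, with the confidence value modulating how the cue is rendered; the projection step itself is the same.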