Predicting the Impressions of Interaction with a Robot from Physical Actions Using AICO-Corpus Annotations

Published: 2023 · Last Modified: 26 Aug 2024 · RO-MAN 2023 · CC BY-SA 4.0
Abstract: In human-human communication, people often interact while inferring each other's emotions and impressions not only from verbal information but also from non-verbal information. Similarly, in human-robot interaction, predicting the impression a person has of the robot is important for the robot to adapt its behavior and achieve good interaction. In this work, we use gaze and gesture annotation data from human-robot interactions in the AICO-Corpus and show that an LSTM-based approach has potential for predicting impressions of interaction with a robot. We also analyze which types of nonverbal information influence impressions of the robot in English and Japanese, respectively.
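The abstract describes feeding sequences of nonverbal annotations (gaze, gesture) to an LSTM to score an impression. As a rough illustration of that idea, here is a minimal pure-Python LSTM cell applied to a sequence of hypothetical per-frame annotation features; the feature layout, weight initialization, and final scoring (squashing the mean of the last hidden state) are assumptions for the sketch, not the paper's actual model or the AICO-Corpus format.

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class LSTMCell:
    """Minimal LSTM cell in pure Python, for illustration only."""

    def __init__(self, n_in, n_hidden, seed=0):
        rng = random.Random(seed)

        def mat(rows, cols):
            return [[rng.uniform(-0.1, 0.1) for _ in range(cols)]
                    for _ in range(rows)]

        # One weight matrix and bias per gate: input, forget, cell, output.
        self.W = {g: mat(n_hidden, n_in + n_hidden) for g in "ifco"}
        self.b = {g: [0.0] * n_hidden for g in "ifco"}
        self.n_hidden = n_hidden

    def step(self, x, h, c):
        z = x + h  # concatenate input features with previous hidden state

        def gate(g, act):
            return [act(sum(w * v for w, v in zip(row, z)) + bias)
                    for row, bias in zip(self.W[g], self.b[g])]

        i = gate("i", sigmoid)     # input gate
        f = gate("f", sigmoid)     # forget gate
        g = gate("c", math.tanh)   # candidate cell state
        o = gate("o", sigmoid)     # output gate
        c_new = [fv * cv + iv * gv for fv, cv, iv, gv in zip(f, c, i, g)]
        h_new = [ov * math.tanh(cv) for ov, cv in zip(o, c_new)]
        return h_new, c_new

def score_sequence(cell, frames):
    """Run the cell over a frame sequence; squash the mean of the
    final hidden state into (0, 1) as a toy impression score."""
    h = [0.0] * cell.n_hidden
    c = [0.0] * cell.n_hidden
    for x in frames:
        h, c = cell.step(x, h, c)
    return sigmoid(sum(h) / len(h))

# Hypothetical per-frame annotations: [gaze_at_robot, gesture_active].
frames = [[1.0, 0.0], [1.0, 1.0], [0.0, 1.0], [1.0, 0.0]]
cell = LSTMCell(n_in=2, n_hidden=4)
print(score_sequence(cell, frames))
```

In practice such a model would be trained (e.g. with a standard deep-learning framework) on annotated interaction sequences paired with impression labels; the untrained weights here only demonstrate the data flow from frame-level nonverbal features to a sequence-level score.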