Learning from Humans to Generate Communicative Gestures for Social Robots

Published: 01 Jan 2020, Last Modified: 12 May 2023. Venue: UR 2020.
Abstract: Non-verbal behaviors play an essential role in human-human interaction, allowing people to convey their intentions and attitudes and affecting social outcomes. In the context of human-robot interaction, communicative gestures are particularly important because they are expected to endow social robots with the ability to emphasize their speech, describe objects, or convey their intentions. In this paper, we propose an approach to learn the relation between human behaviors and natural language based on a Conditional Generative Adversarial Network (CGAN). We demonstrated the validity of our model on a public dataset. The experimental results indicated that the generated human-like gestures correctly convey the meaning of the input sentences. The generated gestures were then mapped to the target robot's motion, yielding personalized communicative gestures for the robot that showed significant improvements over the baselines and could be widely accepted and understood by the general public.
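To make the described setup concrete, below is a minimal, illustrative sketch of a text-conditioned GAN for gesture generation in the spirit of the abstract: a generator maps noise plus a sentence embedding to a pose sequence, and a discriminator scores (gesture, sentence) pairs. All names, layer sizes, dimensions, and the use of a precomputed sentence embedding are assumptions for illustration, not the authors' implementation.

```python
# Minimal text-conditioned GAN (CGAN) sketch for gesture generation.
# Dimensions, architecture, and hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn

NOISE_DIM = 32   # latent noise fed to the generator (assumed)
TEXT_DIM = 128   # size of a precomputed sentence embedding (assumed)
POSE_DIM = 30    # joints * coordinates per frame (assumed)
SEQ_LEN = 40     # pose frames per gesture clip (assumed)


class Generator(nn.Module):
    """Maps (noise, sentence embedding) -> a sequence of poses."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(NOISE_DIM + TEXT_DIM, 256),
            nn.ReLU(),
            nn.Linear(256, SEQ_LEN * POSE_DIM),
            nn.Tanh(),  # poses assumed normalized to [-1, 1]
        )

    def forward(self, noise, text_emb):
        x = torch.cat([noise, text_emb], dim=-1)
        return self.net(x).view(-1, SEQ_LEN, POSE_DIM)


class Discriminator(nn.Module):
    """Scores how plausible a (gesture, sentence) pair is."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(SEQ_LEN * POSE_DIM + TEXT_DIM, 256),
            nn.LeakyReLU(0.2),
            nn.Linear(256, 1),
        )

    def forward(self, poses, text_emb):
        x = torch.cat([poses.flatten(1), text_emb], dim=-1)
        return self.net(x)


if __name__ == "__main__":
    g, d = Generator(), Discriminator()
    bce = nn.BCEWithLogitsLoss()
    opt_g = torch.optim.Adam(g.parameters(), lr=2e-4)
    opt_d = torch.optim.Adam(d.parameters(), lr=2e-4)

    # Dummy batch standing in for real (gesture, sentence-embedding) pairs.
    real_poses = torch.randn(8, SEQ_LEN, POSE_DIM).clamp(-1, 1)
    text_emb = torch.randn(8, TEXT_DIM)

    # One adversarial step: the discriminator separates real from generated
    # gestures for the same sentence; the generator tries to fool it.
    noise = torch.randn(8, NOISE_DIM)
    fake_poses = g(noise, text_emb)

    d_loss = bce(d(real_poses, text_emb), torch.ones(8, 1)) + \
             bce(d(fake_poses.detach(), text_emb), torch.zeros(8, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    g_loss = bce(d(fake_poses, text_emb), torch.ones(8, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```

In practice, the generated pose sequence would then be retargeted to the robot's joint space, as the abstract describes; that retargeting step is robot-specific and not shown here.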