Towards data-driven sign language interpreting virtual assistant

Anonymous

23 Jun 2021 (modified: 05 May 2023) · ACM ICMI 2021 Workshop GENEA · Blind Submission
Keywords: sign language, virtual assistant, generation, HRI
Abstract: Sign Languages (SL) are a form of communication in the visual-gestural modality, and are full-fledged natural languages. Recent years have seen the increase in the use of virtual agents as assistants for the sign language users. Research into sign language recognition has demonstrated promising potential for the improvement of the communication with the deaf people. However, the area of sign language synthesis is still in its infancy. This explains the underdevelopment of the virtual intelligent signing systems, which could bridge the communication with the deaf and make it more favorable. In addition, the existing models are often restricted to the manually written rules and require expert knowledge, while data-driven approach could provide a better solution. In this paper, we present a user study on the evaluation of the data-driven Virtual Assistant that performs signing gestures for the Kazakh-Russian sign language using sign sequences. The study sets out to answer three research questions concerning the users' perceptions and feedback on the performance of the four signing agents, namely two data-driven avatars, one motion capture animation avatar and a human sign interpreter. The results of the questionnaire suggest that while signing avatars generally perform well, they could not outperform the human agent in terms of naturalness and likeability. Hence, a further study might include the improvements necessary to increase the naturalness of the signing gestures.