Enhancing Participation Experience in VR Live Concerts by Improving Motions of Virtual Audience Avatars

Abstract: While participating in live concerts is a promising application of virtual reality (VR), it still falls short of the participation experience we have in the real world. In particular, previous studies on increasing participant engagement emphasized the importance of social experience among audience members, such as the sense of co-presence elicited by sharing physical reactions or body movements synchronized with the music. In this respect, a common strategy in existing platforms is to present avatars of remote human participants in a VR venue, with each avatar imitating the movements of the corresponding participant. However, this strategy implicitly assumes that a substantial number of users connect simultaneously to watch the same content, and is thus inapplicable when only a few users gather or a user is watching alone. Therefore, with the aim of providing a better experience to a user who participates in live concerts as a member of the audience, we examine computational approaches to enhancing the sense of co-presence through virtual audience avatars. We propose four methods of presenting avatar movements: copying the user’s own movements, copying other users’ movements, repeating beat-synchronous movements, and synthesizing movements with machine learning. We compare their effectiveness in a user experiment and discuss application scenarios and design implications that open up new forms of active media consumption in VR environments.
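Of the four methods, "repeating beat-synchronous movements" is the most self-contained to illustrate. The following is a minimal sketch, not the paper's implementation: it loops a short motion clip so that one repetition spans exactly one beat of the music, which is one simple way an avatar could move in time with a concert. All names (`beat_synchronous_pose`, the string-labeled pose frames) are hypothetical.

```python
def beat_synchronous_pose(clip_frames, bpm, t):
    """Return the avatar pose frame for playback time t (seconds),
    looping one motion clip per beat so the movement repeats in
    sync with music at the given tempo (beats per minute)."""
    beat_period = 60.0 / bpm                  # seconds per beat
    phase = (t % beat_period) / beat_period   # position within the current beat, in [0, 1)
    index = int(phase * len(clip_frames))     # map the phase onto a clip frame
    return clip_frames[index]

# Example: a 4-frame "head bob" clip at 120 BPM (0.5 s per beat);
# the frame labels here are placeholders for actual skeletal poses.
clip = ["neutral", "down", "lowest", "up"]
pose_now = beat_synchronous_pose(clip, 120, 0.3)
```

In a real system the clip would hold skeletal poses and `t` would come from the audio clock, so the avatar stays locked to the music even if rendering stalls.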