MiXR-Interact: Mixed Reality Interaction Dataset for Gaze, Hand, and Body

14 Jan 2025 (modified: 27 Feb 2025) · HRI 2025 Workshop VAM Submission · CC BY 4.0
Keywords: Mixed Reality, Body Tracking, Human-Robot Interaction
Abstract: This paper presents MiXR-Interact, a dataset providing motion tracking data for users’ interactions in mixed reality (MR) environments, focusing on their gaze, upper-body movements, and hand gestures. The dataset is collected with the Meta Quest Pro headset, offering an easy-to-use resource for researchers and developers working in MR and human-computer interaction (HCI). MiXR-Interact focuses on capturing natural and precise interactions with virtual objects through three core interaction types: pushing, pointing, and grasping. To ensure robustness and generalization, each interaction is performed across six distinct directions, reflecting a diverse range of movement trajectories relative to the user’s body. This directional diversity provides insight into how users approach and engage with virtual objects from multiple angles. To precisely track contact during interactions, 17 key contact points are defined and labeled for each direction; these serve as reference markers to localize and quantify the joint-to-object contact points for each interaction type and direction. In addition to providing the dataset, the paper evaluates its quality and precision through a set of evaluation metrics covering Trajectory Similarity, Joint Orientation, and Joint-to-Contact Alignment. It also details the theoretical and implementation considerations for dataset collection, offering valuable insights for applications in MR and human-robot interaction (HRI).
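The abstract names the evaluation metrics but does not give their formulas. As a minimal sketch only, the Python snippet below illustrates one plausible way to compute a trajectory-similarity score (here a dynamic-time-warping alignment cost, an assumption) and a joint-to-contact alignment error (assumed to be the distance from a tracked joint to the nearest of the 17 labeled contact points). All function names and the synthetic data are hypothetical and stand in for the actual dataset records.

```python
# Hypothetical sketch: the paper does not specify its exact metric definitions,
# so this illustrates one plausible implementation using NumPy only.
import numpy as np

def dtw_distance(traj_a: np.ndarray, traj_b: np.ndarray) -> float:
    """Dynamic-time-warping distance between two 3D trajectories of shape (N, 3) and (M, 3)."""
    n, m = len(traj_a), len(traj_b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(traj_a[i - 1] - traj_b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return cost[n, m] / (n + m)  # length-normalized alignment cost

def joint_to_contact_error(joint_pos: np.ndarray, contact_points: np.ndarray) -> float:
    """Distance from a joint position (3,) to the nearest labeled contact point in (K, 3)."""
    return float(np.min(np.linalg.norm(contact_points - joint_pos, axis=1)))

# Synthetic data standing in for a reference and a recorded fingertip trajectory,
# plus the 17 labeled contact points of one interaction direction.
rng = np.random.default_rng(0)
ref = np.cumsum(rng.normal(size=(120, 3)) * 0.01, axis=0)
rec = ref + rng.normal(scale=0.005, size=ref.shape)
contacts = rng.uniform(-0.1, 0.1, size=(17, 3))

print("trajectory similarity (DTW):", dtw_distance(ref, rec))
print("joint-to-contact alignment:", joint_to_contact_error(rec[-1], contacts))
```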
Submission Number: 1
