Keywords: Mixed-Reality, Body tracking, Human-robot interaction
Abstract: This paper presents MiXR-Interact, a dataset providing motion tracking data for users’ interactions in mixed reality (MR) environments, covering gaze, upper-body movements, and hand gestures. The dataset is collected with the Meta Quest Pro headset, offering an easy-to-use resource for researchers and developers working in MR and human-computer interaction (HCI). MiXR-Interact focuses on natural and precise interactions with virtual objects, covering three core interaction types: pushing, pointing, and grasping. To ensure robustness and generalization, each interaction is performed across six distinct directions, reflecting a diverse range of movement trajectories relative to the user’s body. This directional diversity provides critical insights into how users approach and engage with virtual objects from multiple angles. In addition, 17 key contact points are defined and labeled for each direction to precisely track contact during interactions. These contact points serve as reference markers to accurately localize and quantify the joint-to-object contacts for each interaction type and direction. Beyond providing the dataset, the paper evaluates the quality and precision of the collected data in MR through a set of evaluation metrics that assess critical aspects of interaction performance: Trajectory Similarity, Joint Orientation, and Joint-to-Contact Alignment. It also details the theoretical and implementation considerations for dataset collection, offering valuable insights for applications in MR and human-robot interaction (HRI).
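The abstract names Trajectory Similarity as one of the evaluation metrics but does not define it; a common choice for comparing 3D motion trajectories of this kind is the discrete Fréchet distance. The sketch below is an illustrative implementation under that assumption (the function name, NumPy usage, and the choice of metric are not taken from the paper).

```python
import numpy as np

def discrete_frechet(p, q):
    """Discrete Frechet distance between two 3D trajectories.

    p: (N, 3) array of joint positions over time.
    q: (M, 3) array of joint positions over time.
    Returns the maximum pointwise distance under the best
    monotone alignment of the two trajectories.
    """
    p, q = np.asarray(p, float), np.asarray(q, float)
    n, m = len(p), len(q)
    # ca[i, j] holds the Frechet distance of the prefixes p[:i+1], q[:j+1].
    ca = np.empty((n, m))
    d = lambda i, j: np.linalg.norm(p[i] - q[j])
    ca[0, 0] = d(0, 0)
    for i in range(1, n):
        ca[i, 0] = max(ca[i - 1, 0], d(i, 0))
    for j in range(1, m):
        ca[0, j] = max(ca[0, j - 1], d(0, j))
    for i in range(1, n):
        for j in range(1, m):
            ca[i, j] = max(min(ca[i - 1, j], ca[i - 1, j - 1], ca[i, j - 1]),
                           d(i, j))
    return ca[-1, -1]

# Example: a recorded hand trajectory vs. the same path shifted by 1 unit.
a = np.array([[0, 0, 0], [1, 0, 0], [2, 0, 0]])
b = a + np.array([0, 1, 0])
print(discrete_frechet(a, b))  # → 1.0
```

Because the metric takes a maximum over the optimal alignment, it is sensitive to the worst deviation along the interaction, which makes it a reasonable check of how closely a reproduced motion follows a reference trajectory.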
Submission Number: 1