Abstract: Interactions with Extended Reality Head-Mounted Displays (XR HMDs) require precise, intuitive, and efficient input methods. Current approaches either rely on power-intensive sensors, such as cameras for hand tracking, or on specialized hardware such as controllers. Previous work has explored the use of familiar, readily available devices such as smartphones and smartwatches as a more practical input alternative. However, this approach risks interaction overload – how can one determine whether the user’s gestures on the watch or phone are directed toward control of the XR device or the mobile device itself? To address this ambiguity, we propose a novel method for cross-device input arbitration based on the relative orientation between the HMD and the target device, as measured by on-device IMUs. In a validation study with 6 users, we demonstrate 93.7% accuracy in estimating the intended device of interaction. Our method offers a practical, energy-efficient way to leverage users’ existing devices for input and enable seamless cross-device experiences in XR.
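To make the core idea concrete, the sketch below illustrates one way orientation-based arbitration could work: compute the relative rotation between the HMD's and the mobile device's IMU orientations (as quaternions) and threshold the resulting angle to decide which device the input targets. The function names, the decision rule, and the 45° threshold are illustrative assumptions, not the paper's actual pipeline.

```python
import numpy as np

def quat_conjugate(q):
    """Conjugate of a unit quaternion [w, x, y, z]."""
    w, x, y, z = q
    return np.array([w, -x, -y, -z])

def quat_multiply(a, b):
    """Hamilton product of two quaternions [w, x, y, z]."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
        w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
        w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
        w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2,
    ])

def relative_angle_deg(q_hmd, q_device):
    """Angle (degrees) of the rotation relating the device frame to the HMD frame."""
    q_rel = quat_multiply(quat_conjugate(q_hmd), q_device)
    w = np.clip(abs(q_rel[0]), 0.0, 1.0)
    return np.degrees(2.0 * np.arccos(w))

def arbitrate(q_hmd, q_device, threshold_deg=45.0):
    """Illustrative decision rule (assumption): route input to the mobile device
    when its orientation is close to the HMD's (e.g., held up in view),
    otherwise treat the gesture as XR input."""
    return "mobile" if relative_angle_deg(q_hmd, q_device) < threshold_deg else "xr_hmd"
```

In practice, the IMU orientations of both devices would need to be expressed in (or aligned to) a common reference frame before the relative angle is meaningful; how that alignment and the actual classification are done is described in the paper itself.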