Spatial Computing and Intuitive Interaction: Bringing Mixed Reality and Robotics Together

Published: 01 Jan 2022 · Last Modified: 17 Jan 2025 · IEEE Robotics and Automation Magazine, 2022 · License: CC BY-SA 4.0
Abstract: Spatial computing—the ability of devices to be aware of their surroundings and to represent this digitally—offers novel capabilities in human–robot interaction. In particular, the combination of spatial computing and egocentric sensing on mixed reality (MR) devices enables robots to capture and understand human behaviors and translate them to actions with spatial meaning, which offers exciting possibilities for collaboration between people and machines. This article presents several human–robot systems that utilize these capabilities to enable novel use cases: mission planning for inspection, gesture-based control, and immersive teleoperation. These works demonstrate the power of MR as a tool for human–robot interaction and the potential of spatial computing and MR to drive future developments.
