Abstract: Mixed Reality (MR) originated in science fiction, but its arrival in reality can reshape our society. By combining the physical and virtual worlds, it provides novel forms of immersive interaction and experience and thereby enables a new generation of applications. The most exciting and challenging of these support collaborative, multi-user operation at large geographical scale and require real-time environment comprehension and high visual fidelity. Their success or failure is strongly influenced by the capabilities and performance limits of the edge cloud platforms and 5G/6G networks that provide offloading for CPU/GPU-intensive MR functions. In addition, the desired quality of user experience calls for further application-level mechanisms that hide the consequences of varying network characteristics. In this paper, we propose a novel edge-cloud-based architecture for future remote-rendered MR applications supporting low-latency immersive interactions. Our contribution is threefold. First, we present the system architecture, focusing on the remote rendering, 3D simulation, and environment detection control loops. Second, we highlight the main features of our proof-of-concept prototype and our dedicated application, namely a Mixed Reality version of a Rocket League-inspired game. Third, we validate the concepts via experiments on a Beyond 5G infrastructure, analyzing the operation and latency characteristics of the overall system. The quality of user experience is also evaluated via real-life experiments conducted as part of a student competition. The results show that the latency and jitter of the most sensitive loop, the render loop, can be managed efficiently by combining a network-level control mechanism (slice priorities) with an application-level mechanism (a dynamic jitter buffer).
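The abstract names a dynamic jitter buffer as the application-level mechanism but does not describe its internals. The sketch below is a minimal, hypothetical illustration of one common approach to adaptive playout buffering for a remote-rendered frame stream, where the target playout delay tracks the mean and spread of recently observed frame transit delays; the class name, the safety factor k, the window size, and the delay bounds are assumptions for illustration, not the authors' design.

```python
import statistics
from collections import deque


class DynamicJitterBuffer:
    """Illustrative adaptive playout buffer for a remote-rendered frame stream.

    The target playout delay follows mean + k * stddev of recently observed
    one-way frame delays, so the buffer absorbs network jitter while staying
    as shallow as current conditions allow. All parameters are assumptions,
    not values taken from the paper.
    """

    def __init__(self, k: float = 2.0, window: int = 100,
                 min_delay_ms: float = 5.0, max_delay_ms: float = 50.0):
        self.k = k                           # safety factor on delay variation
        self.delays = deque(maxlen=window)   # sliding window of observed delays
        self.min_delay_ms = min_delay_ms
        self.max_delay_ms = max_delay_ms

    def observe(self, transit_delay_ms: float) -> None:
        """Record the measured network transit delay of a received frame."""
        self.delays.append(transit_delay_ms)

    def target_playout_delay_ms(self) -> float:
        """Playout delay to apply to incoming frames under current jitter."""
        if len(self.delays) < 2:
            return self.max_delay_ms         # conservative until samples exist
        mean = statistics.fmean(self.delays)
        std = statistics.pstdev(self.delays)
        target = mean + self.k * std
        return min(max(target, self.min_delay_ms), self.max_delay_ms)
```

In such a scheme, the render client would call observe() for each received frame and schedule its display at arrival time plus target_playout_delay_ms(), trading a small, bounded amount of extra latency for smooth frame pacing under varying network conditions.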
External IDs: dblp:conf/mmsys/DokaNJFVRGS25