360Align: An Open Dataset and Software for Investigating QoE and Head Motion in 360° Videos with Alignment Edits

Published: 01 Jan 2024 · Last Modified: 10 Apr 2025 · IMX 2024 · CC BY-SA 4.0
Abstract: This paper presents the resources used for, and gathered from, a subjective QoE experiment in which 45 participants watched 360° videos processed with offline alignment edits. These edits redirect the assumed field of view to a specified region of interest by rotating the 360° video around the horizon; the experiment included alignment edits with both gradual and instant rotation. We employed the Double Stimulus method, in which participants rated each original-processed video pair, yielding a dataset of 5400 comfort and sense-of-presence ratings. Head motion (HM) was recorded with a Meta Quest 2 device during video consumption. The resulting dataset, containing the original and processed videos, is made publicly available. The web application developed to run the experiment is released in a public repository, together with scripts for evaluating the head rotation data. A cross-analysis of QoE and HM behavior provides insight into the efficacy of alignment edits for aligning attention within the same scene. This comprehensive set of experimental resources establishes a foundation for further research on 360° videos processed with alignment edits.
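The sketch below is a minimal illustration of the idea behind an offline alignment edit, assuming the 360° video is stored in equirectangular projection, where a rotation around the horizon's vertical axis (yaw) reduces to a horizontal circular shift of each frame. The function names `yaw_align_equirect` and `gradual_yaw_schedule` are hypothetical and are not taken from the released software; they only show how instant and gradual rotations could be realized.

```python
import numpy as np


def yaw_align_equirect(frame: np.ndarray, yaw_deg: float) -> np.ndarray:
    """Apply a yaw rotation to one equirectangular frame.

    In equirectangular projection, rotating the sphere around the
    vertical axis corresponds to a circular shift along the width.
    frame: H x W x C image array; yaw_deg: rotation in degrees.
    """
    width = frame.shape[1]
    shift = int(round(yaw_deg / 360.0 * width))
    return np.roll(frame, shift, axis=1)


def gradual_yaw_schedule(num_frames: int, total_yaw_deg: float):
    """Yield per-frame yaw offsets for a gradual alignment edit.

    An instant edit would apply total_yaw_deg to a single frame;
    a gradual edit distributes the rotation over num_frames frames.
    """
    for i in range(num_frames):
        yield total_yaw_deg * (i + 1) / num_frames
```

For example, applying `yaw_align_equirect(frame, yaw)` with `yaw` drawn from `gradual_yaw_schedule(90, 60.0)` would spread a 60° redirection of the assumed field of view over 90 frames, whereas passing the full 60° to a single frame would correspond to an instant rotation.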