A Screen-Based Multimodal Virtual Classroom Interface for Understanding Behavioral Sensory Responses in Autistic Adolescents: A Pilot Study
Abstract: Autism affects at least 1 in 100 children worldwide, and about 90% of autistic children experience sensory processing difficulties. Virtual Reality (VR), which can deliver precisely controlled sensory stimuli, has emerged as a promising tool for studying sensory experiences. However, VR systems that use a head-mounted display may cause discomfort and exacerbate sensory challenges for autistic children. Screen-based VR could offer a viable alternative, but research on designing multimodal sensory delivery systems that simulate real-life experiences remains limited, and the impact of on-screen VR on children's behavioral sensory responses is therefore not well understood. To fill this gap, we designed a novel screen-based Multimodal Virtual Classroom Interface (MVCI) as a pilot study. MVCI delivers well-controlled visual, auditory, and tactile stimuli that closely mimic a real classroom environment. The pilot study involved 9 autistic adolescents and 17 typically developing (TD) adolescents, all of whom accepted the system (100% acceptance). Quantitative behavioral analysis showed that, even with the small sample size, on-screen interaction revealed significant differences (p < 0.05) between the two groups in eye gaze, fine motor movements, and eye-hand alignment. Several behavioral patterns were also strongly correlated with participants' sensory profiles and ADHD symptom severity (p < 0.05, r_s > 0.7). Using a novel Fixation Sequence Modeling (FSM) framework, we predicted participants' near-future performance with high accuracy (97-98% proximity) based on their granular behavioral responses.
DOI: 10.1109/tnsre.2025.3619484