Leveraging Machine Learning and Threshold-Free Cluster Enhancement to Unravel Perception of Emotion and Implied Movement
Keywords: Electroencephalography, Event-Related Potentials, Implied Motion, Multimodal Deep Learning
TL;DR: Study explores emotional processing with ERPs and subjective intensity ratings in 30 students viewing stimuli with or without implied motion. EEG analyzed via TFCE (time domain) and repeated-measures ANOVA (frequency domain). A multimodal deep learning model predicts intensity from EEG features.
Abstract: Understanding the neural mechanisms underlying emotional processing is critical for advancements in emotional neuroscience. This study explores the relationship between emotion and motion perception using Event-Related Potentials (ERPs) in a structured experimental setup. We incorporate subjective intensity ratings to enrich the data by capturing participants' experiences of emotional stimuli with and without implied motion. Thirty university students participated in the study, where EEG data were collected and analyzed using threshold-free cluster enhancement (TFCE) for time-domain analysis and repeated-measures ANOVA for frequency-domain analysis. Furthermore, we developed a multimodal deep learning model to predict subjective intensity levels from EEG-derived features. This model leverages statistical, spectral, and autocovariance features, integrated through a transformer encoder layer, to enhance predictive capability. Our findings contribute to a deeper understanding of emotional processing in the brain and highlight the importance of incorporating subjective measures in neuroscience research.
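The abstract describes fusing statistical, spectral, and autocovariance EEG features through a transformer encoder layer to predict subjective intensity. Below is a minimal PyTorch sketch of one way such a fusion could be structured; the class name, feature dimensions, attention-head count, and regression head are illustrative assumptions, not the authors' implementation.

    # Hypothetical sketch (not the authors' code): three EEG-derived feature
    # vectors are projected to a shared dimension, treated as tokens, fused by
    # a single transformer encoder layer, and regressed onto an intensity rating.
    import torch
    import torch.nn as nn

    class MultimodalIntensityModel(nn.Module):
        def __init__(self, stat_dim=32, spec_dim=64, acov_dim=48, d_model=64):
            super().__init__()
            # One linear projection per feature family (dimensions are assumed)
            self.proj_stat = nn.Linear(stat_dim, d_model)
            self.proj_spec = nn.Linear(spec_dim, d_model)
            self.proj_acov = nn.Linear(acov_dim, d_model)
            encoder_layer = nn.TransformerEncoderLayer(
                d_model=d_model, nhead=4, batch_first=True)
            self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=1)
            self.head = nn.Linear(d_model, 1)  # predicted intensity rating

        def forward(self, stat, spec, acov):
            # Stack the three projected feature vectors as a 3-token sequence
            tokens = torch.stack(
                [self.proj_stat(stat), self.proj_spec(spec), self.proj_acov(acov)],
                dim=1)                      # shape: (batch, 3, d_model)
            fused = self.encoder(tokens)    # self-attention across modalities
            return self.head(fused.mean(dim=1)).squeeze(-1)

    # Usage with random placeholder features for a batch of 8 trials
    model = MultimodalIntensityModel()
    pred = model(torch.randn(8, 32), torch.randn(8, 64), torch.randn(8, 48))

Treating each feature family as a token lets the encoder's self-attention weight the modalities per trial, which is one plausible reading of "integrated through a transformer encoder layer" in the abstract.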
Track: 4. AI-based clinical decision support systems
Registration Id: PMNJ7LCLRCQ
Submission Number: 292