Abstract: Detecting human vigilance states (e.g., natural shifts between alertness and drowsiness) from functional magnetic resonance imaging (fMRI) data can provide novel insight into the whole-brain patterns underlying these critical states. Moreover, because a person’s vigilance level is closely tied to their behavior and brain activity, vigilance state can strongly influence the results of fMRI studies. The ability to annotate fMRI scans with vigilance information can therefore enable clearer and more robust results in fMRI research. However, well-established vigilance indicators are derived from other modalities, such as behavioral responses, electroencephalography (EEG), and pupillometry, which are not typically acquired during fMRI data collection. While previous work indicates the promise of distinguishing vigilance states from fMRI alone, EEG data can provide reliable vigilance indicators that complement and augment fMRI-domain information. Here, we propose CBrain: Cross-modal learning for Brain vigilance detection in resting-state fMRI. Our model transfers EEG vigilance information into an fMRI latent space during training and predicts human vigilance states from fMRI data alone at test time, eliminating the need for external vigilance indicators. Experimental results demonstrate CBrain’s ability to predict vigilance states across different individuals at a granularity of 10 fMRI frames, achieving an 81.07% mF1 score on a test set of unseen subjects. Additionally, our generalization experiments highlight the model’s potential to estimate vigilance in an unseen task and in resting-state fMRI scans collected with a different scanner at a different site. Source code: https://github.com/neurdylab/CBrain.
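To make the cross-modal training idea concrete, the sketch below illustrates one way EEG-derived vigilance information could guide an fMRI latent space during training while inference uses fMRI alone. It is a minimal sketch under assumptions: all module names (`FMRIEncoder`, `EEGEncoder`, `VigilanceClassifier`), the MSE-based latent alignment, and the window/feature dimensions are hypothetical illustrations, not the actual CBrain architecture or losses.

```python
# Hypothetical sketch of cross-modal training: an EEG branch supervises the
# fMRI latent space during training; only the fMRI branch is used at test time.
import torch
import torch.nn as nn


class FMRIEncoder(nn.Module):
    """Encodes a window of fMRI frames (e.g., 10 frames x ROIs) into a latent vector."""
    def __init__(self, n_rois: int, window: int = 10, dim: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),
            nn.Linear(n_rois * window, 256), nn.ReLU(),
            nn.Linear(256, dim),
        )

    def forward(self, x):  # x: (batch, window, n_rois)
        return self.net(x)


class EEGEncoder(nn.Module):
    """Encodes EEG-derived vigilance features for the same window into the shared latent space."""
    def __init__(self, n_feats: int, dim: int = 128):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(n_feats, 256), nn.ReLU(), nn.Linear(256, dim))

    def forward(self, x):  # x: (batch, n_feats)
        return self.net(x)


class VigilanceClassifier(nn.Module):
    """Predicts the vigilance state (e.g., alert vs. drowsy) from an fMRI latent vector."""
    def __init__(self, dim: int = 128, n_classes: int = 2):
        super().__init__()
        self.head = nn.Linear(dim, n_classes)

    def forward(self, z):
        return self.head(z)


def training_step(fmri, eeg_feats, labels, f_enc, e_enc, clf, alpha: float = 1.0):
    """One training step: classification loss on fMRI latents plus an alignment
    loss that pulls fMRI latents toward EEG latents (EEG acts as a teacher)."""
    z_f = f_enc(fmri)
    z_e = e_enc(eeg_feats).detach()
    logits = clf(z_f)
    loss_cls = nn.functional.cross_entropy(logits, labels)
    loss_align = nn.functional.mse_loss(z_f, z_e)  # simple latent-space alignment
    return loss_cls + alpha * loss_align


@torch.no_grad()
def predict(fmri, f_enc, clf):
    """Inference path: only fMRI is required; no EEG at test time."""
    return clf(f_enc(fmri)).argmax(dim=-1)
```

In this kind of setup, the alignment term is what carries the EEG vigilance signal into the fMRI representation during training, so the classifier can operate on fMRI alone once training is complete; the actual transfer mechanism used by CBrain may differ.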
External IDs: dblp:conf/miccai/LiLPZSGBC25