A scalable self-supervised method for modeling human intracranial recordings during natural behavior
Keywords: Self-supervised learning, Human neuroscience, Naturalistic behavior, ECoG, sEEG, iEEG
TL;DR: We introduce a self-supervised masked modeling framework that learns from multi-participant intracranial recordings during naturalistic behavior, enabling scalable decoding of human behavior, auditory processing, and language.
Abstract: Understanding how the brain supports natural behavior is an increasingly central goal in human neuroscience. Recordings from neurosurgical patients implanted with intracranial EEG electrodes offer direct access to widespread brain electrical activity during a variety of behaviors over extended periods. Despite progress in the field, using these recordings at scale to identify the neural underpinnings of natural human behavior remains difficult because of variability in electrode placement, channel geometry, and behavioral diversity across participants and sessions. To address these challenges, we introduce a self-supervised framework for multi-participant intracranial neural data. We use a Perceiver-based architecture that reconstructs masked channels of neural activity from unmasked channels, using learnable embeddings of channel identity and contextual information to capture inter-channel dependencies without requiring labels. Fine-tuning our self-supervised model improves decoding performance on a panel of downstream tasks, highlighting the potential of self-supervised learning to enable general-purpose neural decoding and to support scalable integration of naturalistic human brain recordings.
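For concreteness, the sketch below illustrates the masked-channel modeling idea described in the abstract: channel time series are tokenized with learnable channel-identity embeddings, a subset of channels is masked, a Perceiver-style latent array attends over the unmasked tokens, and masked channels are reconstructed from identity-embedding queries. All module names, dimensions, the single cross-attention encoder/decoder, and the MSE objective are illustrative assumptions for exposition, not the authors' implementation.

```python
# Minimal sketch of masked-channel modeling with a Perceiver-style encoder.
# Architecture details and hyperparameters are assumptions, not the paper's model.
import torch
import torch.nn as nn


class MaskedChannelPerceiver(nn.Module):
    def __init__(self, n_channels, n_timepoints, d_model=128, n_latents=64, n_heads=4):
        super().__init__()
        # Project each channel's time series to a token and add a learnable
        # channel-identity embedding so the model knows which electrode it sees.
        self.signal_proj = nn.Linear(n_timepoints, d_model)
        self.channel_emb = nn.Embedding(n_channels, d_model)
        self.mask_token = nn.Parameter(torch.zeros(1, 1, d_model))
        # Perceiver-style latent array that attends over the channel tokens.
        self.latents = nn.Parameter(torch.randn(n_latents, d_model) * 0.02)
        self.encode_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.decode_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.head = nn.Linear(d_model, n_timepoints)  # reconstruct the signal

    def forward(self, x, mask):
        # x: (batch, channels, time); mask: (batch, channels) bool, True = masked.
        b, c, t = x.shape
        ch_ids = torch.arange(c, device=x.device).expand(b, c)
        tokens = self.signal_proj(x) + self.channel_emb(ch_ids)
        # Replace masked channels' content with a shared mask token; keep identity.
        tokens = torch.where(
            mask.unsqueeze(-1), self.mask_token + self.channel_emb(ch_ids), tokens
        )
        # Cross-attention: latents read from the channel tokens.
        latents = self.latents.expand(b, -1, -1)
        latents, _ = self.encode_attn(latents, tokens, tokens)
        # Channel-identity queries read from the latents to decode masked channels.
        queries = self.channel_emb(ch_ids)
        decoded, _ = self.decode_attn(queries, latents, latents)
        return self.head(decoded)  # (batch, channels, time) reconstruction


if __name__ == "__main__":
    model = MaskedChannelPerceiver(n_channels=32, n_timepoints=256)
    x = torch.randn(2, 32, 256)             # toy neural activity
    mask = torch.rand(2, 32) < 0.3          # mask roughly 30% of channels
    recon = model(x, mask)
    loss = ((recon - x) ** 2)[mask].mean()  # reconstruction loss on masked channels only
    print(recon.shape, loss.item())
```

Because channel identity is an embedding rather than a fixed input ordering, a model of this form can in principle be trained across participants whose electrode counts and placements differ, which is the property the framework relies on for multi-participant pretraining.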
Submission Number: 76