Keywords: Foundation model, long sequence modeling, EEG, automatic sleep staging, deep neural network
TL;DR: We present an EEG foundation model that captures long-range dependencies and reduces implausible sleep-staging errors.
Abstract: In this paper, we present a novel EEG foundation model designed to capture long-term temporal dependencies in EEG sequences and to reduce implausible predictions in sleep stage annotation.
Unlike existing models that treat short EEG epochs as independent samples, our method aggregates a sequence of pre-tokenized EEG epochs and learns structured dynamics spanning multiple stages.
We adopt a masked-token prediction framework, analogous to masked language modeling, to enable robust temporal representation learning.
Empirical results on the SHHS dataset show that our model outperforms four state-of-the-art EEG foundation models across standard classification metrics.
Moreover, we introduce a novel metric, the Irregular Transition Rate (ITR), to assess the biological plausibility of stage transitions. Our method reduces the ITR to 15.2\%, compared with 29.6\% (BIOT) and 33.7\% (EEGPT), confirming its superior ability to model coherent sleep dynamics.
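The abstract does not spell out how the ITR is computed; as a rough illustration only, the sketch below assumes it is the fraction of consecutive-epoch stage changes whose (previous, next) pair falls outside a set of physiologically common transitions. The function name, the stage coding, and the allowed-transition set are all hypothetical placeholders, not the paper's definition.

```python
"""Illustrative sketch of an Irregular Transition Rate (ITR) computation.

Assumption (not taken from the paper): a transition is "irregular" if the
ordered pair of consecutive, differing stages is absent from an allowed set
of physiologically common transitions. The allowed set below is a placeholder.
"""

# Hypothetical set of plausible (from_stage, to_stage) pairs.
PLAUSIBLE = {
    ("W", "N1"), ("N1", "N2"), ("N2", "N3"), ("N3", "N2"),
    ("N2", "N1"), ("N2", "REM"), ("REM", "N1"), ("REM", "W"),
    ("N1", "W"), ("N2", "W"), ("N3", "W"), ("REM", "N2"),
}


def irregular_transition_rate(hypnogram: list[str]) -> float:
    """Fraction of stage changes whose (prev, next) pair is not in PLAUSIBLE."""
    # Collect only adjacent epochs where the predicted stage actually changes.
    changes = [(a, b) for a, b in zip(hypnogram, hypnogram[1:]) if a != b]
    if not changes:
        return 0.0
    irregular = sum(1 for pair in changes if pair not in PLAUSIBLE)
    return irregular / len(changes)


if __name__ == "__main__":
    predicted = ["W", "N1", "N2", "N3", "REM", "N2", "W"]  # toy hypnogram
    print(f"ITR = {irregular_transition_rate(predicted):.1%}")
```

Under this illustrative definition, a lower ITR means fewer predicted transitions that are physiologically unlikely, which is the sense in which the reported 15.2\% indicates more coherent sleep dynamics.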
Submission Number: 72