Keywords: EEG representation, EEG pretraining
TL;DR: We propose UniEEG, the first electrode-wise time-frequency pretraining model, designed to overcome barriers across diverse tasks and data in EEG modeling
Abstract: Previous electroencephalogram (EEG) models typically exhibit limited performance and generalization because they are trained on data collected specifically for a single targeted EEG task. Recognizing this limitation, we propose UniEEG, the first electrode-wise time-frequency pretraining model, designed to overcome barriers across diverse tasks and data in EEG modeling. We collect data from nearly 20 publicly available EEG datasets covering 6 EEG tasks, significantly expanding the training data volume. The collected EEG data are standardized and split into individual electrodes as the input to UniEEG, enabling full compatibility with diverse EEG data from different acquisition devices and task paradigms. Meanwhile, leveraging a time-frequency transform, UniEEG robustly processes EEG signals characterized by noise and time delays. In the pretraining phase, we employ an encoder-decoder architecture with a masked signal modeling strategy on the time-frequency dimension to learn an electrode-wise universal EEG representation. In the fine-tuning phase, multi-electrode EEG signals from various tasks are consolidated into individual electrodes. Predictions for downstream tasks are then obtained through the pre-trained encoder and an additional prediction module. The proposed UniEEG achieves state-of-the-art performance across different EEG tasks, demonstrating a strong capability for universal EEG feature representation.
Code, data, and models will be made available upon acceptance.
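As a concrete illustration of the pipeline the abstract describes, here is a minimal sketch of electrode-wise time-frequency preprocessing followed by masked signal modeling with an encoder-decoder. All specifics (STFT window, mask ratio, model dimensions, layer counts) are illustrative assumptions, not the authors' actual UniEEG implementation.

```python
# Minimal sketch, assuming an STFT-based time-frequency transform and a
# transformer encoder-decoder; all hyperparameters are hypothetical.
import numpy as np
import torch
import torch.nn as nn
from scipy.signal import stft

def electrode_wise_tf(eeg, fs=256, nperseg=64):
    """Split a (channels, samples) EEG array into per-electrode
    time-frequency images via STFT magnitude."""
    specs = []
    for ch in eeg:                          # each electrode handled independently
        _, _, Z = stft(ch, fs=fs, nperseg=nperseg)
        specs.append(np.abs(Z))             # (freq_bins, time_frames)
    return np.stack(specs)                  # (channels, freq_bins, time_frames)

class MaskedTFAutoencoder(nn.Module):
    """Toy encoder-decoder for masked signal modeling on the
    time-frequency dimension (a stand-in, not UniEEG itself)."""
    def __init__(self, freq_bins=33, d_model=128):
        super().__init__()
        self.embed = nn.Linear(freq_bins, d_model)  # one token per time frame
        self.mask_token = nn.Parameter(torch.zeros(d_model))
        enc = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc, num_layers=4)
        self.decoder = nn.Linear(d_model, freq_bins)  # reconstruct masked frames

    def forward(self, spec, mask_ratio=0.5):
        # spec: (batch, time_frames, freq_bins) for a single electrode
        tokens = self.embed(spec)
        mask = torch.rand(tokens.shape[:2], device=tokens.device) < mask_ratio
        tokens[mask] = self.mask_token      # replace masked frames with a learned token
        recon = self.decoder(self.encoder(tokens))
        # reconstruction loss on masked positions only, as in masked modeling
        return ((recon - spec) ** 2)[mask].mean()
```

In fine-tuning, the decoder would be replaced by a task-specific prediction module on top of the pre-trained encoder, consistent with the abstract's description.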
Primary Area: applications to neuroscience & cognitive science
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Reciprocal Reviewing: I understand the reciprocal reviewing requirement as described on https://iclr.cc/Conferences/2025/CallForPapers. If none of the authors are registered as a reviewer, it may result in a desk rejection at the discretion of the program chairs. To request an exception, please complete this form at https://forms.gle/Huojr6VjkFxiQsUp6.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 10379