TaKF$^{+}$: A versatile and parameter-efficient tuning for EEG foundation model

27 Sept 2024 (modified: 19 Nov 2024) · ICLR 2025 Conference Withdrawn Submission · CC BY 4.0
Keywords: EEG, Foundation model, Parameter-efficient fine-tuning, Additive fine-tuning
Abstract: Electroencephalogram (EEG) data, widely used in brain-computer interfaces (BCIs), pose challenges for reusing deep learning models trained on specific datasets due to variations in recording configurations and domain gaps. While foundation models pre-trained on large-scale EEG datasets have emerged as a promising solution, how to adapt them effectively to downstream tasks remains underexplored. To address this, we propose a novel tuning method, TaKF$^{+}$, which consists of a Task-Adaptive Key-Feature Extractor (TaKF) and adapter modules. TaKF$^{+}$ efficiently extracts task-relevant features from EEG foundation models for downstream tasks while keeping the pre-trained model's parameters frozen and significantly reducing computational overhead. We evaluate TaKF$^{+}$ on a diverse range of tasks, including motor imagery, emotion recognition, and seizure detection, over publicly available datasets, and demonstrate superior performance and adaptability compared to existing methods. Our research paves the way for more efficient and versatile applications of EEG foundation models across various domains.
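The abstract describes TaKF$^{+}$ as an additive, parameter-efficient tuning scheme: small trainable modules are added while the foundation model's weights stay frozen. As a rough illustration of that general recipe (not the paper's actual TaKF$^{+}$ architecture, which is not specified here), the PyTorch sketch below attaches a hypothetical bottleneck adapter to each layer of a frozen transformer encoder; the adapter design, dimensions, and toy encoder are all assumptions.

```python
import torch
import torch.nn as nn

class BottleneckAdapter(nn.Module):
    """Residual bottleneck adapter: down-project, nonlinearity, up-project.
    Hypothetical illustration; not the paper's TaKF module."""
    def __init__(self, dim: int, bottleneck_dim: int = 16):
        super().__init__()
        self.down = nn.Linear(dim, bottleneck_dim)
        self.up = nn.Linear(bottleneck_dim, dim)
        self.act = nn.GELU()
        nn.init.zeros_(self.up.weight)  # start as an identity mapping
        nn.init.zeros_(self.up.bias)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.up(self.act(self.down(x)))

class AdaptedLayer(nn.Module):
    """Wraps one frozen encoder layer and appends a trainable adapter."""
    def __init__(self, layer: nn.Module, dim: int):
        super().__init__()
        self.layer, self.adapter = layer, BottleneckAdapter(dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.adapter(self.layer(x))

# Toy stand-in for a pre-trained EEG foundation model (assumed shape).
dim = 128
encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=dim, nhead=4, batch_first=True),
    num_layers=2,
)
for p in encoder.parameters():
    p.requires_grad = False  # preserve the foundation model's parameters

adapted = nn.Sequential(*(AdaptedLayer(l, dim) for l in encoder.layers))
head = nn.Linear(dim, 4)  # e.g., 4 motor-imagery classes (assumption)

# Only the adapters and the task head are optimized.
trainable = [p for m in (adapted, head) for p in m.parameters() if p.requires_grad]
optimizer = torch.optim.AdamW(trainable, lr=1e-3)

x = torch.randn(8, 256, dim)           # (batch, EEG tokens, feature dim), synthetic
logits = head(adapted(x).mean(dim=1))  # mean-pool tokens, then classify
```

Because the up-projection is zero-initialized, each adapter begins as an identity map, so fine-tuning starts from the pre-trained model's behavior and only the small adapter and head parameters are updated.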
Supplementary Material: zip
Primary Area: foundation or frontier models, including LLMs
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Reciprocal Reviewing: I understand the reciprocal reviewing requirement as described on https://iclr.cc/Conferences/2025/CallForPapers. If none of the authors are registered as a reviewer, it may result in a desk rejection at the discretion of the program chairs. To request an exception, please complete this form at https://forms.gle/Huojr6VjkFxiQsUp6.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 9414