Abstract: Foundation Models have demonstrated significant success across various domains in Artificial Intelligence (AI), yet their capabilities for brainwave modeling remain unclear. In this paper, we comprehensively evaluate current Large Brainwave Foundation Models (LBMs) through systematic fine-tuning experiments across multiple Brain-Computer Interface (BCI) benchmark tasks, including memory tasks and sleep stage classification. Our extensive analysis shows that state-of-the-art LBMs achieve only marginal improvements (0.5\%) over traditional deep architectures while requiring significantly more parameters (millions vs thousands), raising important questions about their efficiency and applicability in BCI contexts. Moreover, through detailed ablation studies and Low-Rank Adaptation (LoRA), we significantly reduce trainable parameters without performance degradation, while demonstrating that architectural and training inefficiencies limit LBMs' current capabilities. Our experiments span both full model fine-tuning and parameter-efficient adaptation techniques, providing insights into optimal training strategies for BCI applications. We pioneer the application of LoRA to LBMs, revealing that performance benefits generally emerge when adapting multiple neural network components simultaneously. These findings highlight the critical need for domain-specific development strategies to advance LBMs, suggesting that current architectures may require redesign to fully leverage the potential of foundation models in brainwave analysis.
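To make the parameter-efficient adaptation described above concrete, the following is a minimal PyTorch sketch of LoRA applied to selected linear layers of a transformer-style brainwave encoder. The `LoRALinear` wrapper, the `add_lora` helper, the target module names (`q_proj`, `v_proj`), and the rank/alpha values are illustrative assumptions for exposition, not the paper's actual implementation or hyperparameters.

```python
# Minimal LoRA sketch for a transformer-style brainwave encoder (illustrative only;
# module names and hyperparameters are assumptions, not the paper's configuration).
import torch
import torch.nn as nn


class LoRALinear(nn.Module):
    """Wrap a frozen nn.Linear with a trainable low-rank update: W x + (alpha/r) * B A x."""

    def __init__(self, base: nn.Linear, r: int = 8, alpha: int = 16):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False  # keep the pretrained weights frozen
        self.scale = alpha / r
        self.lora_A = nn.Parameter(torch.randn(r, base.in_features) * 0.01)
        self.lora_B = nn.Parameter(torch.zeros(base.out_features, r))  # zero init: no change at start

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.base(x) + self.scale * (x @ self.lora_A.T @ self.lora_B.T)


def add_lora(model: nn.Module, targets=("q_proj", "v_proj"), r: int = 8) -> nn.Module:
    """Replace named linear sub-modules (e.g. attention query/value projections)
    with LoRA-wrapped versions; all other parameters are left untouched."""
    for name, module in model.named_children():
        if isinstance(module, nn.Linear) and name in targets:
            setattr(model, name, LoRALinear(module, r=r))
        else:
            add_lora(module, targets, r)
    return model


# Hypothetical usage: `pretrained_lbm` stands in for a pretrained LBM backbone.
# for p in pretrained_lbm.parameters():
#     p.requires_grad = False                      # freeze the full backbone
# model = add_lora(pretrained_lbm, targets=("q_proj", "v_proj"))
```

In line with the finding summarized in the abstract, such a sketch would typically extend `targets` to several components at once (e.g. attention and feed-forward projections) rather than adapting a single projection in isolation.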
Lay Summary: We tested Large Brainwave Foundation Models (LBMs) on tasks such as movement and sleep stage classification. Despite their size, these models only slightly outperformed much smaller ones while using far more resources. By applying techniques like LoRA, we reduced training demands without losing accuracy. Our findings suggest that current LBMs are inefficient and need to be redesigned to fully realize their potential in brainwave analysis.
Primary Area: Deep Learning->Foundation Models
Keywords: Foundation Models, Large Brainwave Foundation Models, Brain-Computer Interface (BCI), Low-Rank Adaptation (LoRA), Electroencephalogram (EEG)
Submission Number: 10371