EvoMoE: Evolutionary Mixture-of-Experts for SSVEP-EEG Classification With User-Independent Training

Xiaoli Yang, Yurui Li, Jianyu Zhang, Huiyuan Tian, Shijian Li, Gang Pan

Published: 01 Sept 2025 · Last Modified: 09 Nov 2025 · IEEE Journal of Biomedical and Health Informatics · CC BY-SA 4.0
Abstract: The analysis of EEG data in BCI systems captures unique individual characteristics, presenting diverse patterns that deviate from conventional identical-distribution assumptions. Applying AI models directly to brain data is therefore challenging due to this non-identical distribution issue. Meanwhile, as the number of users in BCI systems rises, scalable models are crucial for handling the growing data volume. Moreover, the limited availability of individual data necessitates training on collective data, requiring models with strong generalization capabilities. To address these challenges, we propose Evolutionary Mixture of Experts (EvoMoE), a framework that leverages a set of diverse experts to model data from different individuals. Users with similar distributions are grouped together, allowing each expert to handle EEG data of a particular distribution type. The gating network of EvoMoE selects the experts that most closely match the distribution of the current sample, effectively tackling the non-identical distribution issue. When an unrecognized distribution is encountered, a new expert is introduced to accommodate the new data pattern, ensuring model adaptability. Evaluations on two 40-category BCI Speller datasets demonstrate significant performance improvements over state-of-the-art methods. On the BETA dataset, our online EvoMoE achieves a 13.06% increase in accuracy and a 27.24-point increase in information transfer rate (ITR) compared to the online UI method; on the Bench dataset, it achieves a 3.64% increase in accuracy and a 10.42-point increase in ITR. These qualities make EvoMoE a promising solution for practical BCI implementation, while setting the stage for the development of comprehensive large-scale biological models.
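The routing-and-growth idea described in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation: the distance-to-centroid gate, the `threshold` parameter, and the `EvoMoESketch` class are all illustrative assumptions; in EvoMoE a learned gating network performs this matching over EEG features.

```python
import numpy as np

class EvoMoESketch:
    """Toy sketch of the EvoMoE routing idea (illustrative assumptions,
    not the paper's code): each expert is summarized by a centroid of the
    feature distribution it handles; the gate routes a sample to the
    nearest centroid, and spawns a new expert when no centroid is close
    enough (an "unrecognized distribution")."""

    def __init__(self, threshold: float):
        self.centroids: list[np.ndarray] = []  # one centroid per expert
        self.threshold = threshold             # max distance to reuse an expert

    def route(self, x: np.ndarray) -> int:
        """Return the index of the matching expert, adding one if needed."""
        if self.centroids:
            dists = [np.linalg.norm(x - c) for c in self.centroids]
            best = int(np.argmin(dists))
            if dists[best] <= self.threshold:
                return best  # distribution recognized: reuse this expert
        # No expert matches: evolve the model by adding a new expert.
        self.centroids.append(x.copy())
        return len(self.centroids) - 1

moe = EvoMoESketch(threshold=1.0)
print(moe.route(np.zeros(4)))       # first sample creates expert 0
print(moe.route(np.full(4, 0.1)))   # close to expert 0, reused
print(moe.route(np.full(4, 5.0)))   # far from all centroids: new expert 1
```

In the actual system, grouping users with similar distributions under a shared expert plays the role of the centroid comparison above, and the newly added expert would then be trained on the new user's data rather than frozen at a single sample.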