EEGReXferNet – A Lightweight Gen-AI Framework for EEG Subspace Reconstruction via Cross-Subject Transfer Learning and Channel-Aware Embedding
Keywords: Gen-AI, EEG, subspace reconstruction, VAE, artifact removal
TL;DR: A lightweight generative AI framework for artifact-resilient EEG subspace reconstruction via cross-subject transfer learning, enhancing fidelity for real-time brain–computer interface applications.
Abstract: Electroencephalography (EEG) is a widely used non-invasive technique for monitoring brain activity, but low signal-to-noise ratios (SNR) caused by various artifacts often compromise its utility. Conventional artifact removal methods require manual intervention or risk suppressing critical neural features during filtering and reconstruction. Recent advances in generative models, including Variational Autoencoders (VAEs) and Generative Adversarial Networks (GANs), have shown promise for EEG reconstruction; however, these approaches often lack integrated temporal-spectral-spatial sensitivity and are computationally intensive, limiting their suitability for real-time applications such as brain–computer interfaces (BCIs). To overcome these challenges, we introduce EEGReXferNet, a lightweight Gen-AI framework for EEG subspace reconstruction via cross-subject transfer learning, developed using Keras/TensorFlow (v2.15.1). EEGReXferNet employs a modular architecture that leverages volume conduction across neighboring channels, band-specific convolutional encoding, and dynamic latent feature extraction through sliding windows. By integrating reference-based scaling, the framework ensures continuity across successive windows and generalizes effectively across subjects. This design improves spatial-temporal-spectral resolution (mean PSD correlation $\geq 0.95$; mean spectrogram RV coefficient $\geq 0.85$), reduces total weights by $\sim 45\%$ to mitigate overfitting, and maintains computational efficiency for robust, real-time EEG preprocessing in BCI applications. The model and its components are openly shared to support reproducible research and cross-domain applications.
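The preprocessing steps the abstract names (sliding-window segmentation, neighboring-channel grouping for volume conduction, and reference-based scaling for inter-window continuity) can be sketched as below. This is a minimal illustrative NumPy sketch, not the released implementation; the function names, the RMS-based scaling rule, and the one-neighbor-per-side grouping are assumptions for illustration only.

```python
import numpy as np

def sliding_windows(eeg, win, step):
    """Segment a (channels, samples) EEG array into overlapping windows.

    Returns an array of shape (n_windows, channels, win)."""
    n = (eeg.shape[1] - win) // step + 1
    return np.stack([eeg[:, i * step : i * step + win] for i in range(n)])

def neighbor_stack(windows, ch, k=1):
    """Group a target channel with k neighbors on each side, a simple
    stand-in for exploiting volume conduction across adjacent channels."""
    lo, hi = max(0, ch - k), min(windows.shape[1], ch + k + 1)
    return windows[:, lo:hi, :]

def reference_scale(windows, ref_idx=0):
    """Rescale every window to the RMS amplitude of a reference window,
    an assumed form of reference-based scaling for continuity."""
    ref_rms = np.sqrt(np.mean(windows[ref_idx] ** 2))
    rms = np.sqrt(np.mean(windows ** 2, axis=(1, 2), keepdims=True))
    return windows * (ref_rms / (rms + 1e-12))
```

In the full framework these windows would then pass through band-specific convolutional encoders into the latent subspace; the sketch above only covers the windowing and scaling stages.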
Submission Number: 37