Keywords: Contrastive Learning, Self-Supervised Learning, EEG Representation Learning, Multivariate Time Series, Regime-Adaptive Learning, Supervised Contrastive Learning, Brain–Computer Interface, Cross-Subject Generalization
Abstract: Contrastive learning methods such as SimCLR (Simple Framework for Contrastive Learning) and SupCon (Supervised Contrastive Learning) are effective across many domains, but their behavior on noisy multivariate EEG time series remains insufficiently understood. We show that contrastive performance is regime-dependent: instance-based learning is more robust in low-data settings, while supervised contrastive learning becomes more effective as data scale and class coverage increase. Motivated by this finding, we propose ReACT (Regime-Adaptive Contrastive Training), which dynamically balances instance-level and label-based supervision using signals from the training regime, subject identity, and representation consistency. Experiments on EEG motor-imagery benchmarks show that ReACT achieves more stable performance than SimCLR, SupCon, and fixed hybrid objectives, highlighting the importance of adaptive supervision in noisy, high-variability time-series domains.
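The abstract describes balancing an instance-level (SimCLR-style) objective against a label-based (SupCon-style) objective. A minimal NumPy sketch of such a blend is below; the function names and the scalar mixing weight `alpha` are illustrative assumptions, not the paper's actual method, which derives its balance from training regime, subject identity, and representation consistency rather than a fixed scalar.

```python
import numpy as np

def nt_xent_loss(z, temperature=0.5):
    # Instance-level (SimCLR-style) loss: rows 0..n-1 and n..2n-1 are two
    # augmented views of the same n samples; each view's positive is its pair.
    n = z.shape[0] // 2
    z = z / np.linalg.norm(z, axis=1, keepdims=True)
    sim = z @ z.T / temperature
    np.fill_diagonal(sim, -np.inf)  # exclude self-similarity
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(n)])
    return -log_prob[np.arange(2 * n), pos].mean()

def supcon_loss(z, labels, temperature=0.5):
    # Supervised contrastive (SupCon-style) loss: every other sample with
    # the same label counts as a positive.
    z = z / np.linalg.norm(z, axis=1, keepdims=True)
    sim = z @ z.T / temperature
    np.fill_diagonal(sim, -np.inf)
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    mask = (labels[:, None] == labels[None, :]) & ~np.eye(len(labels), dtype=bool)
    per_sample = np.where(mask, log_prob, 0.0).sum(axis=1) / np.maximum(mask.sum(axis=1), 1)
    return -per_sample.mean()

def blended_loss(z, labels, alpha):
    # alpha -> 0 favors instance-level learning (low-data regime);
    # alpha -> 1 favors label supervision (large-scale, full class coverage).
    return (1 - alpha) * nt_xent_loss(z) + alpha * supcon_loss(z, labels)
```

A fixed `alpha` corresponds to the "fixed hybrid objectives" baseline the abstract compares against; a regime-adaptive scheme would instead set the weight per batch or per training stage.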
Submission Number: 91