CLIQ: Contrastive Learning with XAI-guided Interpretation and Model Quantization for EEG-based Emotion Recognition

ICLR 2026 Conference Submission 21859 Authors

19 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: Emotion recognition, EEG, Self-supervised learning, Contrastive learning, XAI, Quantization, CNN
TL;DR: We introduce CLIQ, a contrastive learning approach with novel pairing and batching for EEG emotion recognition. Through quantization, feature analysis, and interpretation, we identify two channels with the greatest inter-emotional difference and reduce both data and model size.
Abstract: Electroencephalogram (EEG) may be a promising way to recognize human emotions, in contrast to outward expressions, which can be hidden or artificially simulated. This paper applies self-supervised learning (SSL) to process complex EEG signals with a small amount of labeled data for the emotion recognition task. The proposed approach is based on a convolutional encoder with a novel contrastive loss and batching function. It has been evaluated on the SEED and DEAP datasets. We also compared different preprocessing techniques in the temporal, frequency, and time-frequency domains. We achieved fairly high accuracy even with a small amount of labeled data, with best accuracies of 88.7% and 87.3% on SEED, and 95.3% and 63.1% on DEAP, for subject-dependent and subject-independent evaluations, respectively. Additionally, we performed feature analysis and found that the greatest inter-emotional difference appeared in the T7 and T8 channels. We validated these findings with an iterative application of DeepLIFT. Combined with model quantization, these insights enabled us to reduce data and model size without a significant decrease in accuracy. The proposed approach achieved separable vector representations of EEG and performance comparable with the SOTA, enabled insightful data analysis and model interpretation with reasonable data reduction, and allowed efficient model quantization.
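To make the contrastive-pretraining setup concrete, below is a minimal sketch, assuming PyTorch, of a 1-D convolutional EEG encoder trained with a standard NT-Xent (SimCLR-style) loss over two augmented views of the same segment. The encoder architecture, channel count (62, as in SEED), segment length, and the use of NT-Xent are illustrative assumptions; the paper's actual CLIQ loss and batching function are not specified in this abstract.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class EEGEncoder(nn.Module):
    """Small 1-D convolutional encoder mapping (batch, channels, time) EEG
    segments to L2-normalized embeddings. Layer sizes are illustrative only."""
    def __init__(self, in_channels=62, emb_dim=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(in_channels, 64, kernel_size=7, stride=2, padding=3),
            nn.BatchNorm1d(64), nn.ReLU(),
            nn.Conv1d(64, 128, kernel_size=5, stride=2, padding=2),
            nn.BatchNorm1d(128), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.proj = nn.Linear(128, emb_dim)

    def forward(self, x):
        h = self.net(x).squeeze(-1)          # (B, 128)
        return F.normalize(self.proj(h), dim=-1)

def nt_xent_loss(z1, z2, temperature=0.5):
    """Standard NT-Xent contrastive loss over two views (not the CLIQ loss)."""
    z = torch.cat([z1, z2], dim=0)           # (2B, D)
    sim = z @ z.t() / temperature            # pairwise cosine similarities
    sim.fill_diagonal_(float('-inf'))        # exclude self-similarity
    b = z1.size(0)
    # positive of sample i is its other view: i+B for the first half, i-B for the second
    targets = torch.cat([torch.arange(b, 2 * b), torch.arange(0, b)]).to(z.device)
    return F.cross_entropy(sim, targets)

# Usage: two views of the same EEG segments (e.g., different crops or added noise)
encoder = EEGEncoder(in_channels=62)
x1 = torch.randn(16, 62, 512)  # view 1: 16 segments, 62 channels, 512 samples
x2 = torch.randn(16, 62, 512)  # view 2
loss = nt_xent_loss(encoder(x1), encoder(x2))
loss.backward()
```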
Supplementary Material: zip
Primary Area: applications to neuroscience & cognitive science
Submission Number: 21859