Hierarchical feature distillation model via dual-stage projections and graph embedding label propagation for emotion recognition

Published: 01 Jan 2026, Last Modified: 24 Aug 2025 · Pattern Recognit. 2026 · CC BY-SA 4.0
Abstract: Multi-source domain adaptation faces two key challenges: negative transfer caused by feature coupling, and inefficient pseudo-label generation. This paper develops a multi-source domain adaptation framework for EEG-based emotion recognition (MSGELP) that integrates a two-stage projection-matrix decoupling mechanism with graph-embedded label propagation. The method employs a dynamic source selection mechanism that adaptively selects the top-K most similar source domains based on a similarity evaluation over target-source domain pairs, eliminating likely sources of negative transfer. At the feature-decoupling level, a learnable two-stage projection matrix, comprising a global projection matrix and an alignment projection matrix, explicitly separates cross-domain knowledge: the global projection matrix extracts common features spanning multiple domains, while the alignment projection matrix captures domain-specific features of each source-target pair, preserving discriminative information while avoiding feature entanglement. Furthermore, by constructing a similarity graph over source-target domain pairs and iteratively propagating labels, the graph embedding technique, together with iterative updates of the projection matrices, achieves continuous cross-domain knowledge distillation and effectively improves pseudo-label accuracy. Finally, under a rigorous cross-subject leave-one-subject-out cross-validation protocol, MSGELP achieves classification accuracies of 68.70% and 63.09% on the SEED-IV and SEED-V datasets, respectively. Experimental results indicate that MSGELP effectively learns a shared subspace, mitigates negative transfer, and outperforms state-of-the-art methods. The code is available at https://github.com/czihan1022/MSGELP/.
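To make the two pipeline stages in the abstract concrete, the sketch below illustrates (a) top-K source-domain selection by a similarity score over target-source pairs, and (b) label propagation on a similarity graph. This is a minimal, generic sketch: cosine similarity of mean features and a Gaussian-kernel graph with symmetric normalization are stand-in choices, and the function names are hypothetical; the paper's actual similarity measure, graph construction, and coupling with the projection matrices may differ.

```python
import numpy as np

def select_top_k_sources(target_feats, source_feats_list, k):
    """Rank source domains by similarity to the target and keep the top-k.

    Similarity here is cosine similarity of domain mean features -- an
    illustrative stand-in for the paper's target-source similarity score.
    Discarding low-ranked domains removes likely negative-transfer sources.
    """
    t_mean = target_feats.mean(axis=0)
    sims = []
    for s in source_feats_list:
        s_mean = s.mean(axis=0)
        denom = np.linalg.norm(t_mean) * np.linalg.norm(s_mean) + 1e-12
        sims.append(float(t_mean @ s_mean / denom))
    order = np.argsort(sims)[::-1][:k]  # indices of the k most similar domains
    return list(order), sims

def propagate_labels(features, labels_onehot, labeled_mask,
                     sigma=1.0, alpha=0.9, iters=50):
    """Iterative label propagation on a Gaussian-kernel similarity graph
    (the generic Zhou et al. scheme; the paper's exact graph may differ).

    labels_onehot: (n, c) one-hot labels, rows for unlabeled points ignored.
    labeled_mask:  (n,) 1.0 where the label is known, 0.0 otherwise.
    """
    # Pairwise squared distances -> Gaussian affinity matrix W.
    d2 = np.square(features[:, None, :] - features[None, :, :]).sum(-1)
    W = np.exp(-d2 / (2.0 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    # Symmetrically normalized graph operator S = D^{-1/2} W D^{-1/2}.
    d_inv_sqrt = 1.0 / np.sqrt(W.sum(axis=1) + 1e-12)
    S = d_inv_sqrt[:, None] * W * d_inv_sqrt[None, :]
    # Iterate F <- alpha * S F + (1 - alpha) * Y until (near) convergence.
    Y = labels_onehot * labeled_mask[:, None]
    F = Y.copy()
    for _ in range(iters):
        F = alpha * (S @ F) + (1.0 - alpha) * Y
    return F.argmax(axis=1)  # pseudo-labels for all points
```

In the full method these two steps interleave: propagated pseudo-labels feed the next update of the projection matrices, which in turn refine the features the graph is built from.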