Transformer-Based Domain Adaptation for Multi-Modal Emotion Recognition in Response to Game Animation Videos

Published: 01 Jan 2023, Last Modified: 30 Sept 2024 · BIBM 2023 · CC BY-SA 4.0
Abstract: Emotion recognition research requires the strategic selection of stimuli that evoke targeted emotions for robust physiological analysis. This study pioneers the use of Genshin Impact game animation videos to induce positive and neutral emotional states. It introduces a Transformer-based feature extractor that enhances Domain-Adversarial Neural Networks (DANN), advancing their domain adaptation capability. The Transformer architecture provides parallel processing and handles intricate time-series Electroencephalogram (EEG) and eye movement data, while a discriminator with a gradient reversal layer aligns the source and target domain distributions. Empirical results demonstrate the effectiveness of the Transformer-based DANN model in cross-subject multimodal emotion recognition, achieving a prediction accuracy of 83.38% across 59 subjects.
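The abstract does not spell out the implementation, but the described architecture (a Transformer encoder as the shared feature extractor, an emotion classifier, and a domain discriminator behind a gradient reversal layer) can be sketched roughly as follows. This is a minimal illustrative PyTorch sketch; the module name, layer sizes, and pooling choice are assumptions, not the authors' code.

```python
# Hypothetical sketch of a Transformer-based DANN for multimodal
# (EEG + eye movement) features; dimensions are illustrative assumptions.
import torch
import torch.nn as nn


class GradientReversal(torch.autograd.Function):
    """Identity in the forward pass; reverses and scales gradients backward."""

    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.clone()

    @staticmethod
    def backward(ctx, grad_output):
        # Reversed gradient for x, no gradient for lambd.
        return -ctx.lambd * grad_output, None


class TransformerDANN(nn.Module):
    def __init__(self, input_dim, d_model=128, n_heads=4, n_layers=2,
                 n_emotions=2, n_domains=2):
        super().__init__()
        # Project concatenated EEG + eye-movement features to the model dimension.
        self.embed = nn.Linear(input_dim, d_model)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)
        # Emotion classifier trained on labeled (source-domain) data.
        self.emotion_head = nn.Linear(d_model, n_emotions)
        # Domain discriminator fed through the gradient reversal layer.
        self.domain_head = nn.Sequential(
            nn.Linear(d_model, 64), nn.ReLU(), nn.Linear(64, n_domains))

    def forward(self, x, lambd=1.0):
        # x: (batch, time, input_dim) multimodal feature sequence.
        h = self.encoder(self.embed(x)).mean(dim=1)  # temporal average pooling
        emotion_logits = self.emotion_head(h)
        domain_logits = self.domain_head(GradientReversal.apply(h, lambd))
        return emotion_logits, domain_logits
```

In this sketch, minimizing the emotion loss and the domain loss jointly, with gradients through the discriminator reversed, pushes the Transformer features toward being subject-invariant while remaining discriminative for emotion, which is the standard DANN training objective the paper builds on.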
