Enhancing EEG-Based Decision-Making Performance Prediction by Maximizing Mutual Information Between Emotion and Decision-Relevant Features

Published: 01 Jan 2024 · Last Modified: 30 Sep 2024 · IEEE Trans. Affect. Comput. 2024 · CC BY-SA 4.0
Abstract: Emotions are important factors in decision-making. With the advent of brain-computer interface (BCI) techniques, researchers have developed a strong interest in predicting decisions from emotions, which is a challenging task. To predict decision-making performance from emotion, we propose the Maximizing Mutual Information between Emotion and Decision-relevant features (MMI-ED) method, which comprises three modules: (1) a temporal-spatial encoding module that captures spatial correlations and temporal dependencies in electroencephalogram (EEG) signals; (2) a relevant-feature decomposition module that extracts emotion-relevant and decision-relevant features; (3) a relevant-feature fusion module that maximizes mutual information to incorporate useful emotion-related feature information during decision-making prediction. To construct a dataset for predicting decision-making performance from emotion, we designed an experiment involving emotion-elicitation and decision-making tasks and collected EEG, behavioral, and subjective data. We compared our model with several emotion-recognition and motor-imagery models on our dataset. The results demonstrate that our model achieves state-of-the-art performance, with a classification accuracy of 92.96%, which is 6.83% higher than the best-performing baseline. Furthermore, we conducted an ablation study to demonstrate the validity of each module and provided explanations of the brain regions associated with the relevant features.
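The abstract does not specify how the mutual information between emotion-relevant and decision-relevant features is estimated or maximized. As an illustrative sketch only (not the authors' implementation), a common approach in representation learning is an InfoNCE-style contrastive objective, which gives a tractable lower bound on the mutual information between two paired feature sets; all function and variable names below are hypothetical:

```python
import numpy as np

def infonce_mi_lower_bound(emotion_feats, decision_feats, temperature=0.1):
    """InfoNCE lower bound on I(E; D) for paired feature batches.

    emotion_feats, decision_feats: (N, D) arrays where row i of each
    array comes from the same trial (a positive pair); all other rows
    serve as negatives. Maximizing the returned value during training
    tightens a lower bound on the mutual information between the two
    feature sets.
    """
    # Normalize rows so the pairwise scores are cosine similarities.
    e = emotion_feats / np.linalg.norm(emotion_feats, axis=1, keepdims=True)
    d = decision_feats / np.linalg.norm(decision_feats, axis=1, keepdims=True)
    scores = e @ d.T / temperature  # (N, N) similarity matrix

    # Row-wise log-softmax; diagonal entries are the positive pairs.
    log_probs = scores - np.log(np.exp(scores).sum(axis=1, keepdims=True))
    nce_loss = -np.mean(np.diag(log_probs))

    # The InfoNCE bound: I(E; D) >= log(N) - loss.
    return np.log(len(e)) - nce_loss

# Toy check: features sharing a common latent source should yield a
# larger bound than features paired with shuffled (independent) rows.
rng = np.random.default_rng(0)
shared = rng.normal(size=(64, 8))
paired_mi = infonce_mi_lower_bound(shared + 0.1 * rng.normal(size=(64, 8)), shared)
shuffled_mi = infonce_mi_lower_bound(shared, rng.permutation(shared))
```

In practice such a bound would be maximized jointly with the decision-prediction loss, so the fused representation retains emotion information that is predictive of decision-making performance.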