Abstract: Automatic emotion recognition is important in human-computer interaction (HCI). Although extensive electroencephalography (EEG)-based emotion recognition research has been conducted in recent years, effectively identifying the correlation between EEG signals and emotions remains challenging. In this study, a new method is proposed that combines a novel preprocessing technique with a 3D convolutional neural network (3DCNN)-based classifier. After preprocessing, the 3DCNN extracts temporal and spatial features from sequences of 2D-map EEG features. These features are then fed to a fully connected network to obtain binary or multi-category classification results. Extensive experiments on the DEAP dataset show that the proposed method outperforms other state-of-the-art methods. The process of selecting the hyperparameters of the 3DCNN is also investigated by comparing three models. The source code used in this study is available at https://github.com/heibaipei/V-3DCNN.
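To illustrate the core idea of applying a 3D convolution over a sequence of 2D EEG feature maps, the sketch below runs one naive 3D-convolution layer (with ReLU) over a stack of frames. All concrete values are assumptions for illustration only: the frame count (6), the 9×9 map size, and the 3×3×3 kernel are placeholders, not the paper's actual architecture or preprocessing output.

```python
import numpy as np

# Assumed shapes: a trial segmented into T = 6 frames, each a 9x9
# 2D map of EEG features (hypothetical; the paper's exact map size
# and segment length may differ).
T, H, W = 6, 9, 9
rng = np.random.default_rng(0)
frames = rng.standard_normal((T, H, W))  # stand-in for preprocessed EEG maps

def conv3d_valid(x, k):
    """Naive 'valid' 3D cross-correlation, the operation a 3DCNN layer applies.

    The kernel slides along the temporal axis and both spatial axes at
    once, so each output value mixes temporal and spatial information.
    """
    t, h, w = k.shape
    Tx, Hx, Wx = x.shape
    out = np.empty((Tx - t + 1, Hx - h + 1, Wx - w + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            for l in range(out.shape[2]):
                out[i, j, l] = np.sum(x[i:i + t, j:j + h, l:l + w] * k)
    return out

kernel = rng.standard_normal((3, 3, 3))          # spans time + both spatial dims
feat = np.maximum(conv3d_valid(frames, kernel), 0.0)  # ReLU activation
print(feat.shape)  # → (4, 7, 7): both temporal and spatial dims are convolved
```

In a full model, several such layers (with multiple kernels) would be stacked, and the resulting feature volume flattened and passed to the fully connected classifier described above.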