FacialPulse: An Efficient RNN-based Depression Detection via Temporal Facial Landmarks

Published: 20 Jul 2024 · Last Modified: 21 Jul 2024 · MM2024 Oral · CC BY 4.0
Abstract: Depression is a prevalent mental health disorder that significantly impacts individuals' lives and well-being. Early detection and intervention are crucial for effective treatment and management of depression. Recently, many end-to-end deep learning methods have leveraged facial expression features for automatic depression detection. However, most current methods overlook the temporal dynamics of facial expressions. Although very recent 3DCNN methods remedy this gap, they incur higher computational cost due to their CNN-based backbones and redundant facial features. To address these limitations, we exploit the temporal correlation of facial expressions and propose a novel framework called FacialPulse, which recognizes depression with high accuracy and speed. FacialPulse includes a Facial Motion Modeling Module (FMMM) that fully captures temporal features by harnessing bidirectional processing and proficiently handling long-term dependencies. Because the FMMM supports parallel processing and uses a gate mechanism to mitigate gradient vanishing, it also significantly accelerates training. In addition, to replace raw images with facial landmarks and thereby reduce information redundancy, a Facial Landmark Calibration Module (FLCM) is designed to eliminate facial landmark errors and further improve recognition accuracy. Extensive experiments on the AVEC2014 dataset and the MMDA dataset (a depression dataset) demonstrate the superiority of FacialPulse in recognition accuracy and speed, with the average MAE (Mean Absolute Error) decreased by 22% and the recognition speed increased by 100% compared to state-of-the-art baselines.
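To make the core idea concrete, the sketch below shows a bidirectional, gated RNN operating on temporal facial-landmark sequences and regressing a depression severity score, in the spirit of the FMMM described in the abstract. This is a minimal illustration, not the authors' implementation: the landmark count (68 points), hidden size, pooling strategy, and regression head are assumptions, and the FLCM calibration step is omitted.

```python
# Minimal sketch (assumptions noted above), not the FacialPulse code itself.
import torch
import torch.nn as nn

class FacialMotionSketch(nn.Module):
    def __init__(self, num_landmarks: int = 68, hidden_size: int = 128):
        super().__init__()
        # Each frame is a flattened (x, y) landmark vector: 68 * 2 = 136 features.
        self.rnn = nn.GRU(
            input_size=num_landmarks * 2,
            hidden_size=hidden_size,
            num_layers=2,
            batch_first=True,
            bidirectional=True,  # forward and backward temporal context
        )
        # Regress a single severity score (e.g., BDI-II on AVEC2014).
        self.head = nn.Linear(hidden_size * 2, 1)

    def forward(self, landmarks: torch.Tensor) -> torch.Tensor:
        # landmarks: (batch, frames, num_landmarks * 2)
        features, _ = self.rnn(landmarks)
        # Average over time, then predict one score per clip.
        return self.head(features.mean(dim=1)).squeeze(-1)

# Example: a batch of 4 clips, 300 frames each, 68 landmarks per frame.
model = FacialMotionSketch()
scores = model(torch.randn(4, 300, 68 * 2))
print(scores.shape)  # torch.Size([4])
```

Using gated recurrent units over compact landmark vectors, rather than a 3DCNN over raw frames, is what keeps the per-frame input small and the temporal modeling inexpensive, which is the efficiency argument the abstract makes.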
Primary Subject Area: [Engagement] Emotional and Social Signals
Relevance To Conference: Our work contributes to the analysis of emotional and social signals by integrating facial expression analysis into depression detection. By leveraging facial landmarks and Recurrent Neural Networks (RNNs), our system efficiently analyzes temporal changes in facial expressions, providing valuable insights into emotional states and enabling more comprehensive and accurate depression detection systems. By combining facial expression analysis with multimodal processing techniques, this work opens up new possibilities for improving mental health assessment, human-computer interaction, and affective computing applications.
Submission Number: 4581