EMP: Emotion-guided Multi-modal Fusion and Contrastive Learning for Personality Traits Recognition

Published: 01 Jan 2023, Last Modified: 20 Jun 2023, ICMR 2023
Abstract: Multi-modal personality traits recognition aims to recognize personality traits precisely by utilizing information from different modalities, and has received increasing attention for its potential applications in human-computer interaction. Current methods largely fail to extract distinguishable features, remove noise, and align features across modalities, which dramatically affects the accuracy of personality traits recognition. To deal with these issues, we propose an emotion-guided multi-modal fusion and contrastive learning framework for personality traits recognition. Specifically, we first use supervised contrastive learning to extract deeper and more distinguishable features from each modality. Then, considering the close correlation between emotions and personalities, we use an emotion-guided multi-modal fusion mechanism to guide the feature fusion, which suppresses noise and aligns the features from different modalities. Finally, we use an auto-fusion structure to enhance the interaction between modalities and further extract the features essential for final personality traits recognition. Extensive experiments on two benchmark datasets indicate that our method achieves state-of-the-art performance and robustness.
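The supervised contrastive learning step mentioned in the abstract presumably follows the standard supervised contrastive (SupCon) formulation, in which embeddings sharing a label are pulled together and others pushed apart. Below is a minimal NumPy sketch of that generic loss; the function name, the temperature value, and the use of NumPy are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def supcon_loss(features: np.ndarray, labels: np.ndarray, temperature: float = 0.07) -> float:
    """Generic supervised contrastive loss over a batch of embeddings.

    features: (N, D) raw embeddings (L2-normalized internally).
    labels:   (N,) integer class labels; same-label pairs are positives.
    NOTE: illustrative sketch, not the EMP paper's exact objective.
    """
    z = features / np.linalg.norm(features, axis=1, keepdims=True)
    sim = z @ z.T / temperature                       # pairwise cosine similarities
    n = len(labels)
    logits_mask = 1.0 - np.eye(n)                     # exclude self-similarity
    sim = sim - sim.max(axis=1, keepdims=True)        # numerical stability
    exp_sim = np.exp(sim) * logits_mask
    log_prob = sim - np.log(exp_sim.sum(axis=1, keepdims=True))
    pos_mask = (labels[:, None] == labels[None, :]).astype(float) * logits_mask
    pos_counts = pos_mask.sum(axis=1)
    valid = pos_counts > 0                            # anchors with at least one positive
    mean_log_prob_pos = (pos_mask * log_prob).sum(axis=1)[valid] / pos_counts[valid]
    return float(-mean_log_prob_pos.mean())
```

As a sanity check, a batch whose same-label embeddings already cluster together yields a lower loss than the same embeddings with mismatched labels, which is the behavior the contrastive objective optimizes for.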