Emotion-Assisted Multi-Modal Personality Recognition Using Adversarial Contrastive Learning

Yongtang Bao, Yuzhen Wang, Yutong Qi, Qing Yang, Ruijun Liu, Liping Feng

Published: 01 May 2025, Last Modified: 06 Nov 2025 · Knowledge-Based Systems · CC BY-SA 4.0
Abstract:
Highlights
• This study is the first to apply adversarial contrastive learning to multi-modal personality recognition, improving robustness to noise and yielding accurate, stable predictions.
• The model leverages emotion feature-guided fusion and emotion score decision fusion to strengthen the link between emotions and personality traits, boosting recognition performance.
• Adversarial contrastive learning enhances feature quality by reducing noise and improving cross-modal personality trait discrimination.
• Experiments on the ChaLearn First Impressions and ELEA datasets show that EAPRC achieves superior accuracy and correlation, validating the effectiveness of the proposed strategies.
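To illustrate the general idea behind an adversarial contrastive objective on fused multi-modal features, the sketch below applies an FGSM-style perturbation to the fused representation and uses an NT-Xent loss to pull clean and perturbed views of each sample together. This is a minimal, hypothetical example under assumed names and hyperparameters (encoder, epsilon, temperature, feature sizes); it is not the EAPRC model or the authors' implementation.

```python
import torch
import torch.nn.functional as F

def nt_xent(z1, z2, temperature=0.5):
    """NT-Xent contrastive loss between two views of the same batch."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    z = torch.cat([z1, z2], dim=0)                       # (2N, d)
    sim = z @ z.t() / temperature                        # cosine similarities
    n = z1.size(0)
    mask = torch.eye(2 * n, dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(mask, float('-inf'))           # exclude self-pairs
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)]).to(z.device)
    return F.cross_entropy(sim, targets)

def adversarial_contrastive_loss(encoder, fused, epsilon=0.05, temperature=0.5):
    """Contrast clean embeddings with embeddings of adversarially perturbed
    fused multi-modal features (illustrative sketch, not the paper's method)."""
    z_clean = encoder(fused)
    # 1) FGSM-style step: perturb the fused features in the direction that
    #    increases the contrastive loss against the clean embeddings.
    fused_adv = fused.clone().detach().requires_grad_(True)
    inner = nt_xent(encoder(fused_adv), z_clean.detach(), temperature)
    grad, = torch.autograd.grad(inner, fused_adv)
    fused_adv = (fused_adv + epsilon * grad.sign()).detach()
    # 2) Final objective: align clean and adversarial views of each sample.
    return nt_xent(z_clean, encoder(fused_adv), temperature)

# Example usage with made-up dimensions (32 samples, 256-d fused features).
encoder = torch.nn.Sequential(
    torch.nn.Linear(256, 128), torch.nn.ReLU(), torch.nn.Linear(128, 64))
fused = torch.randn(32, 256)
loss = adversarial_contrastive_loss(encoder, fused)
loss.backward()
```

The perturbation acts as a noise model on the fused representation, so training against it encourages embeddings that remain discriminative under small cross-modal distortions, which is the robustness property the highlights attribute to adversarial contrastive learning.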