Highlights
• Learning and generation of multi-modal feature representations.
• Multi-modal interactive fusion for adolescent stress detection.
• Multiple modalities enable better performance than a single modality.

Abstract
Psychological stress is increasingly severe among teenagers and has imposed numerous physical and mental issues on them. The earlier stress is detected, the more effectively it can be managed and alleviated. Smartphones, having become an integral part of daily life, can serve as a means of monitoring and collecting people's daily behaviors and helping them manage stress. In the present study, a Multi-modal Interactive Fusion Method (MIFM) is proposed to detect psychological stress from three types of data (texts, images, and sleep and exercise records) harvested with a self-developed mobile app termed Happort. The method characterizes the associations between each pair of modalities and computes the contribution of each modality via two attention mechanisms. As revealed by our experimental results, fusing multiple modalities for stress detection consistently outperforms any single modality.
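The abstract describes two attention steps: pairwise cross-modal interaction, then a per-modality contribution weighting. The details of MIFM are not given here, so the following is only a minimal sketch of that general pattern; the function names, the mean pooling, and the random "learned" query vector are all illustrative assumptions, not the authors' actual architecture.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(q_feats, kv_feats):
    """Attend from one modality's features to another's.
    q_feats: (n_q, d), kv_feats: (n_kv, d) -> (n_q, d)."""
    scores = q_feats @ kv_feats.T / np.sqrt(q_feats.shape[-1])
    return softmax(scores, axis=-1) @ kv_feats

def fuse_modalities(feats):
    """feats: dict of modality name -> (n_i, d) feature matrix.
    Step 1 (interaction): enrich each modality via cross-attention
    to every other modality, then mean-pool to one vector each.
    Step 2 (contribution): weight the pooled vectors by a second
    attention (query is random here, standing in for a learned one)."""
    names = list(feats)
    enriched = {}
    for a in names:
        parts = [feats[a]] + [cross_attention(feats[a], feats[b])
                              for b in names if b != a]
        enriched[a] = np.mean(parts, axis=0).mean(axis=0)  # pooled (d,)
    d = next(iter(feats.values())).shape[-1]
    query = np.random.default_rng(0).standard_normal(d)    # placeholder
    pooled = np.stack([enriched[a] for a in names])        # (m, d)
    weights = softmax(pooled @ query / np.sqrt(d))         # (m,) contributions
    return weights @ pooled                                # fused (d,) vector
```

The fused vector would then feed a stress classifier; the contribution weights indicate how much each modality (e.g. text, image, sleep/exercise) influences the prediction for a given sample.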