Beyond Emotion: A Multi-Modal Dataset for Human Desire Understanding

Anonymous

08 Mar 2022 (modified: 05 May 2023)
NAACL 2022 Conference Blind Submission
Readers: Everyone
Paper Link: https://openreview.net/forum?id=0gAU9W2ScBs
Paper Type: Long paper (up to eight pages of content + unlimited references and appendices)
Abstract: Desire is a strong wish to do or have something; it involves not only linguistic expression but also the underlying cognitive phenomena that drive human feelings. As one of the most primitive and basic human instincts, conscious desire is often accompanied by a range of emotional responses. Desire understanding remains a strikingly understudied task: it is difficult for machines to model and understand desire because benchmark datasets with desire and emotion labels have not been available. To bridge this gap, we present MSED, the first multi-modal and multi-task sentiment, emotion, and desire dataset, which contains 9,190 text-image pairs with English text. Each multi-modal sample is annotated with labels drawn from six desire, three sentiment, and six emotion categories. We also propose state-of-the-art baselines to evaluate the potential of MSED and to show the importance of multi-task and multi-modal clues for desire understanding. We hope this study provides a benchmark for human desire analysis. MSED will be made publicly available for research.
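To make the annotation scheme in the abstract concrete, the sketch below shows how a single MSED sample might be represented. It is a minimal illustration only: the abstract states the label counts (six desires, three sentiments, six emotions) but not the label names or the dataset's actual schema, so all field names, label inventories, and example values here are assumptions.

```python
from dataclasses import dataclass

# Hypothetical label inventories -- the abstract only gives the counts,
# not the actual category names used in MSED.
SENTIMENTS = ["positive", "neutral", "negative"]           # assumed 3-way scheme
EMOTIONS = ["happiness", "sadness", "anger",
            "fear", "surprise", "disgust"]                 # assumed 6-way, Ekman-style set
DESIRES = [f"desire_{i}" for i in range(6)]                # placeholder names for the 6 desires

@dataclass
class MSEDSample:
    """One text-image pair with its three task labels (illustrative schema only)."""
    text: str        # English text of the sample
    image_path: str  # path to the paired image
    sentiment: str   # one of SENTIMENTS
    emotion: str     # one of EMOTIONS
    desire: str      # one of DESIRES

# Fabricated example, for structure only -- not taken from the dataset.
sample = MSEDSample(
    text="I can't wait to see my family this weekend!",
    image_path="images/000123.jpg",
    sentiment="positive",
    emotion="happiness",
    desire="desire_0",
)
```

A structure like this reflects the multi-task setup described in the abstract: every text-image pair carries a sentiment, an emotion, and a desire label, so a single sample can supervise all three tasks at once.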
Copyright Consent Signature (type Name Or NA If Not Transferrable): Ao Jia
Copyright Consent Name And Address: Beijing Institute of Technology, No. 5 Zhongguancun South Street, Haidian District, Beijing
Presentation Mode: This paper will be presented virtually
Virtual Presentation Timezone: UTC+8