BEAT: Berkeley Emotion and Affect Tracking Dataset

Published: 08 Jun 2024, Last Modified: 08 Jun 2024, CVPR 2024 Workshop POETS 2nd Round Poster, License: CC BY 4.0
Keywords: Emotion Perception, Dataset, Valence and Arousal
Abstract: Recognizing the emotions of other people is critical in everyday life. With the rapid development of robotics, it is also crucial to enable machines to recognize human emotion. Many previous studies have designed automatic emotion perception algorithms to infer the emotions of human characters. Constrained by dataset curation procedures and small annotator pools, these algorithms rely heavily on facial expressions and fail to accurately capture diverse emotional states. In this work, we build the first large video-based Emotion and Affect Tracking Dataset (BEAT), which contains not only facial expressions but also rich contextual information. BEAT comprises 124 videos, including Hollywood movie clips, documentaries, and home videos, annotated with continuous arousal and valence ratings as well as 11 categorical emotional states. We recruited 245 annotators, which strengthens the robustness of the annotations. The annotations in BEAT span a wide range of arousal and valence values and cover diverse emotion categories. BEAT will benefit psychological studies of human emotion perception mechanisms and help the computer vision community develop socially aware intelligent machines that can perceive human emotions.
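To make the annotation scheme concrete, below is a minimal Python sketch of how per-frame valence/arousal ratings from multiple annotators might be represented and averaged into a consensus value. The record layout, the value ranges, and the emotion label names are assumptions for illustration; the abstract does not specify BEAT's file format or label set.

```python
# Hypothetical sketch: BEAT's actual file format and 11-category label set
# are not given in the abstract, so every name here is illustrative.
from dataclasses import dataclass
from statistics import mean

@dataclass
class FrameAnnotation:
    """One annotator's rating of a single video frame."""
    valence: float   # continuous valence, assumed normalized to [-1, 1]
    arousal: float   # continuous arousal, assumed normalized to [-1, 1]
    emotion: str     # one of the 11 categorical emotional states (assumed names)

def consensus(frame_ratings: list[FrameAnnotation]) -> tuple[float, float]:
    """Average the continuous ratings across annotators for one frame."""
    return (
        mean(a.valence for a in frame_ratings),
        mean(a.arousal for a in frame_ratings),
    )

# Example: three annotators rate the same frame.
ratings = [
    FrameAnnotation(valence=0.6, arousal=0.3, emotion="joy"),
    FrameAnnotation(valence=0.5, arousal=0.4, emotion="joy"),
    FrameAnnotation(valence=0.7, arousal=0.2, emotion="surprise"),
]
print(consensus(ratings))  # -> (0.6, 0.3)
```

A simple mean is used here only as a placeholder for whatever aggregation the authors apply; with 245 annotators, more robust schemes (e.g., outlier-trimmed averaging) would also be plausible.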
Submission Number: 7