Mind the Motions: Benchmarking Theory‑of‑Mind in Everyday Body Language

ACL ARR 2025 May Submission 5839 Authors

20 May 2025 (modified: 03 Jul 2025) · ACL ARR 2025 May Submission · CC BY 4.0
Abstract: Our ability to interpret others' mental states through nonverbal cues (NVCs) is fundamental to our survival and social cohesion. Existing Theory of Mind (ToM) benchmarks have primarily focused on false-belief tasks and reasoning under asymmetric information, overlooking both mental states beyond belief and the rich tapestry of human nonverbal communication. We present Motion2Mind, a comprehensive framework for evaluating machines' ToM capabilities in interpreting NVCs. Drawing on a validated profiling handbook written by an FBI agent, we construct a carefully curated video dataset with fine-grained annotations of NVCs paired with psychological interpretations, encompassing 222 types of nonverbal cues and 397 mental states. Our evaluation reveals that current AI systems struggle significantly with NVC interpretation, exhibiting not only a substantial performance gap in Detection but also patterns of over-interpretation in Explanation relative to human annotators.
Paper Type: Long
Research Area: Linguistic theories, Cognitive Modeling and Psycholinguistics
Research Area Keywords: Multimodality and Language Grounding to Vision, Robotics and Beyond, Dialogue and Interactive Systems
Contribution Types: Model analysis & interpretability, Data resources
Languages Studied: English
Submission Number: 5839