From Emotional Mirroring to Emotional Attunement: Do LLMs and Humans Attune to Each Other?

ACL ARR 2026 January Submission 3983 Authors

04 Jan 2026 (modified: 20 Mar 2026) · CC BY 4.0
Keywords: Emotional Attunement, Affective Computing, Social Synchrony, Human-AI Interaction, Large Language Models
Abstract: Much prior work establishes that LLMs effectively mirror the affective state of a user. However, human social interaction depends not on immediate mirroring but on emotional attunement, a process of bidirectional affective synchronization between individuals. In this work, we evaluate whether LLMs emotionally attune to users, comparing LLM-user interactions with client-therapist interactions. We find evidence for a "hollow echo" effect: LLMs strongly mirror user affect in immediate responses but fail to attune to the user's emotional state across multi-turn interactions. This contrasts with client-therapist interactions, where we observe a more durable and moderate form of attunement. Moreover, we find that while clients attune to their therapists, users do not attune to LLMs, so user-LLM attunement cannot be said to be bidirectional. Our findings indicate that current LLMs are inadequate for relationally complex contexts, which require sustained attunement rather than immediate mirroring.
Paper Type: Short
Research Area: Computational Social Science, Cultural Analytics, and NLP for Social Good
Research Area Keywords: Computational Social Science and Cultural Analytics
Contribution Types: Model analysis & interpretability, Data resources
Languages Studied: English
Submission Number: 3983