Multi-Agent Graphical Dual-Attention for Dynamic Long-Horizon Strategic Interaction

ACL ARR 2026 January Submission 10722 Authors

06 Jan 2026 (modified: 20 Mar 2026) · ACL ARR 2026 January Submission · CC BY 4.0
Keywords: Conversational modeling, Applications, Multi-agent systems, Agent communication, Agent coordination and negotiation, Graph-based methods
Abstract: Long-horizon strategic interaction in multi-agent settings arises in negotiation dialogues, online communities, collaborative planning, and competitive games, where outcomes depend jointly on linguistic actions, temporal dynamics, and evolving inter-agent relationships. Phenomena such as deception, negotiation, and legal and political discourse are central to long-term strategic interaction, yet most NLP systems still struggle to recognize these events in free-form dialogue that unfolds over many turns and shifting power dynamics. To address this, we introduce a novel architecture, RG-DAT, a RoBERTa-based multi-agent graphical dual-attention transformer that jointly models message text, agent-state asymmetry features, and a dynamic graph of agent interactions via a graph attention encoder and a dual-attention fusion module. As our primary testbed, we focus on the negotiation-centric online strategy game Diplomacy, using the Diplomacy Deception Dataset, which uniquely annotates both sender intent and receiver perception at the message level. To assess the broader applicability of our approach beyond deception, we additionally evaluate RG-DAT on the CaSiNo dataset, a corpus of campsite negotiation dialogues with rich annotations of negotiation outcomes and strategies. Experiments on Diplomacy and CaSiNo show that RG-DAT substantially outperforms strong baselines and contemporary large language models.
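The abstract's dual-attention fusion of text and graph representations can be illustrated with a minimal sketch. This is not the authors' implementation: the function names, the mean-pooling, and the concatenation-based fusion are all illustrative assumptions; the paper specifies only that text encodings (e.g. from RoBERTa) and agent-graph encodings are combined via two attention directions.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attend(q, k, v):
    # scaled dot-product attention: queries q over keys k, values v
    scores = q @ k.T / np.sqrt(q.shape[-1])
    return softmax(scores, axis=-1) @ v

def dual_attention_fusion(text_h, graph_h):
    # direction 1: text tokens attend over agent-graph node states
    t2g = attend(text_h, graph_h, graph_h)
    # direction 2: agent nodes attend over text token states
    g2t = attend(graph_h, text_h, text_h)
    # one simple fusion choice (hypothetical): pool each view and concatenate
    return np.concatenate([t2g.mean(axis=0), g2t.mean(axis=0)])

rng = np.random.default_rng(0)
text_h = rng.normal(size=(12, 64))   # e.g. 12 subword tokens, hidden dim 64
graph_h = rng.normal(size=(7, 64))   # e.g. 7 agent nodes from a graph encoder
fused = dual_attention_fusion(text_h, graph_h)
print(fused.shape)  # (128,)
```

The fused vector would then feed a message-level classifier (e.g. for sender intent vs. receiver perception); in practice the attention would be learned and multi-headed rather than parameter-free as here.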
Paper Type: Long
Research Area: Dialogue and Interactive Systems
Research Area Keywords: dialogue, conversational modeling, agent communication, agent coordination and negotiation, pragmatic inference and reasoning, discourse-level inference, graph-based methods
Contribution Types: Model analysis & interpretability, NLP engineering experiment, Data analysis
Languages Studied: English
Submission Number: 10722