Communicating Plans, Not Percepts: Scalable Multi-Agent Coordination with Embodied World Models

Published: 19 Sept 2025, Last Modified: 26 Oct 2025 · NeurIPS 2025 Workshop EWM · CC BY 4.0
Keywords: World Models, Model-Based Reinforcement Learning, Multi-Agent Coordination, Embodied Agents, Planning, Intention Communication, Latent Rollout, Emergent Communication, Sample Efficiency, Scalability, Partial Observability, Goal-Directed Interaction
TL;DR: We demonstrate that for robust multi-agent coordination, agents equipped with a compact world model achieve superior scalability and performance by communicating imagined future plans, outperforming communication protocols learned end-to-end.
Abstract: Robust coordination is critical for effective decision-making in multi-agent systems, especially under partial observability. A central question in Multi-Agent Reinforcement Learning (MARL) is whether to engineer communication protocols or learn them end-to-end. We investigate this dichotomy using embodied world models. We propose and compare two communication strategies for a cooperative task-allocation problem. The first, Learned Direct Communication (LDC), learns a protocol end-to-end. The second, Intention Communication, uses an engineered inductive bias: a compact, learned world model, the Imagined Trajectory Generation Module (ITGM), which uses the agent's own policy to simulate future states. A Message Generation Network (MGN) then compresses this plan into a message. We evaluate these approaches on goal-directed interaction in a grid world, a canonical abstraction for embodied AI problems, while scaling environmental complexity. Our experiments reveal that while emergent communication is viable in simple settings, the engineered, world model-based approach shows superior performance, sample efficiency, and scalability as complexity increases. These findings advocate for integrating structured, predictive models into MARL agents to enable active, goal-driven coordination.
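The abstract names two components of the Intention Communication approach: an Imagined Trajectory Generation Module (ITGM) that rolls the agent's own policy forward in a learned, compact world model, and a Message Generation Network (MGN) that compresses the imagined plan into a message. The sketch below illustrates that pipeline under stated assumptions; the module names come from the abstract, but the network architectures, latent/message dimensions, and the rollout interface are illustrative guesses, not the authors' implementation.

```python
# Minimal, hypothetical sketch of the Intention Communication pipeline:
# an ITGM simulates future latent states with the agent's own policy, and an
# MGN compresses the imagined trajectory into a fixed-size message.
# All layer sizes and architectures are assumptions for illustration only.

import torch
import torch.nn as nn


class ImaginedTrajectoryGenerationModule(nn.Module):
    """Compact learned world model: predicts the next latent state from (state, action)."""

    def __init__(self, latent_dim: int, action_dim: int, hidden_dim: int = 64):
        super().__init__()
        self.dynamics = nn.Sequential(
            nn.Linear(latent_dim + action_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, latent_dim),
        )

    def rollout(self, policy: nn.Module, z0: torch.Tensor, horizon: int) -> torch.Tensor:
        """Imagine `horizon` steps ahead by repeatedly querying the agent's own policy."""
        z, imagined = z0, []
        for _ in range(horizon):
            action = policy(z)                                   # imagined action under current policy
            z = self.dynamics(torch.cat([z, action], dim=-1))    # predicted next latent state
            imagined.append(z)
        return torch.stack(imagined, dim=1)                      # (batch, horizon, latent_dim)


class MessageGenerationNetwork(nn.Module):
    """Compresses an imagined trajectory into a fixed-size message for teammates."""

    def __init__(self, latent_dim: int, horizon: int, message_dim: int):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(latent_dim * horizon, 64),
            nn.ReLU(),
            nn.Linear(64, message_dim),
        )

    def forward(self, trajectory: torch.Tensor) -> torch.Tensor:
        return self.encoder(trajectory.flatten(start_dim=1))


if __name__ == "__main__":
    latent_dim, action_dim, horizon, message_dim = 16, 4, 5, 8
    policy = nn.Sequential(nn.Linear(latent_dim, action_dim), nn.Tanh())  # stand-in policy head
    itgm = ImaginedTrajectoryGenerationModule(latent_dim, action_dim)
    mgn = MessageGenerationNetwork(latent_dim, horizon, message_dim)

    z0 = torch.randn(2, latent_dim)            # batch of two agents' latent states
    plan = itgm.rollout(policy, z0, horizon)   # imagined future latent states
    message = mgn(plan)                        # compact plan message to broadcast
    print(message.shape)                       # torch.Size([2, 8])
```

The key design point conveyed by the abstract is that the message encodes a predicted future (a plan), not the current percept, so teammates receive goal-relevant information even under partial observability.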
Submission Number: 6