Keywords: Communication, Offline multi-agent reinforcement learning
Abstract: Learning effective communication is key to improving coordination in multi-agent systems. This paper proposes a novel framework for offline multi-agent reinforcement learning that enables agents to learn communication from offline datasets containing no inter-agent communication. The proposed framework incorporates an attentional communication network into existing offline multi-agent reinforcement learning algorithms. Our experiments demonstrate the feasibility of learning effective communication from pre-existing datasets. In addition, we provide extensive analysis to examine how the learned communication affects performance and to identify the characteristics of environments and datasets that enable effective communication learning.
Primary Area: reinforcement learning
Submission Number: 23349