- Abstract: To cooperate with humans effectively, virtual agents need to be able to understand and execute language instructions. A typical setup to achieve this uses a scripted teacher that guides a virtual agent with language instructions. However, such a setup has clear limitations in scalability and, more importantly, it is not interactive. Here, we introduce an autonomous agent that uses discrete communication to interactively guide other agents to navigate and act in a simulated environment. The developed communication protocol is trainable, emergent, and requires no additional supervision. The emergent language speeds up the learning of new agents, generalizes across incrementally more difficult tasks and, contrary to most other emergent languages, is highly interpretable. We demonstrate how the emitted messages correlate with particular actions and observations, and how new agents become less dependent on this guidance as training progresses. By exploiting the correlations identified in our analysis, we manage to successfully address the agents in their own language.
- Keywords: Language instructions, multi-agent communication, language emergence, compositionality, grid-world, curriculum learning