Human languages are largely compositional: sentences derive their meanings from the meanings of their constituent words. In contrast, emergent communication systems, learned by unsupervised neural networks, rarely develop human-like compositionality. To encourage compositionality, we propose a new training method that combines information-bottleneck losses with a multi-task framework. Training on a diverse set of tasks induces task-specific vocabulary; penalizing complexity reduces redundancy and entanglement in communication. Our information-theoretic framing explains results from studies of noisy-channel emergent communication, and our method outperforms recent population-based training approaches. Our work thus addresses important theoretical questions in compositional communication and achieves state-of-the-art results.
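As a rough illustration of the kind of objective the abstract describes, the sketch below combines per-task losses with a complexity penalty on the emergent messages. The function and variable names, the choice of a KL-to-uniform rate term as the complexity proxy, and the weight `beta` are all assumptions for illustration, not the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def combined_loss(task_losses, message_logits, beta=0.1):
    """Hypothetical multi-task + information-bottleneck objective.

    task_losses: list of scalar losses, one per communication task.
    message_logits: (batch, msg_len, vocab) logits over message symbols.
    beta: trade-off weight on the complexity (rate) penalty.
    """
    # Multi-task term: the same speaker/listener pair is trained on many
    # tasks at once, which the abstract says induces task-specific vocabulary.
    task_term = torch.stack(task_losses).sum()

    # Complexity term: KL between the speaker's symbol distribution and a
    # uniform prior, one common proxy for the message rate in IB-style losses.
    log_q = F.log_softmax(message_logits, dim=-1)
    vocab = message_logits.size(-1)
    log_p = -torch.log(torch.tensor(float(vocab)))  # uniform prior over symbols
    rate = (log_q.exp() * (log_q - log_p)).sum(dim=-1).mean()

    return task_term + beta * rate
```

In this reading, increasing `beta` trades task performance for lower message complexity, which is the mechanism the abstract credits with reducing redundancy and entanglement.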