Complexity-Limited Multi-Task Training for Compositional Emergent Communication

22 Sept 2023 (modified: 25 Mar 2024) · ICLR 2024 Conference Withdrawn Submission
Keywords: Emergent communication, information bottleneck, compositionality
TL;DR: By training in a multi-task environment with pressure to reduce communication complexity, agents learn more compositional emergent communication.
Abstract: Human languages are largely compositional: sentences derive their meanings from the meanings of their constituent words. In contrast, emergent communication systems, learned by unsupervised neural networks, rarely develop human-like compositionality. To encourage compositionality, we propose a new training method that combines information-bottleneck losses with a multi-task framework. By training on a diversity of tasks, we induce task-specific vocabulary; by penalizing complexity, we decrease redundancy and entanglement in communication. Our information-theoretic framing explains results from studies of noisy-channel emergent communication and outperforms recent population-based training methods. Our work thus addresses important theoretical questions in compositional communication and achieves state-of-the-art results.
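The abstract's two core ingredients, per-task listener losses plus an information-bottleneck-style complexity penalty on messages, can be sketched roughly as below. This is a hypothetical illustration under stated assumptions, not the authors' implementation: the function name `multitask_ib_loss`, the uniform prior over symbols, and the weight `beta` are choices made only for the sake of the example.

```python
# Hypothetical sketch (not the paper's code): a multi-task speaker/listener loss
# with an information-bottleneck-style complexity penalty on discrete messages.
import torch
import torch.nn.functional as F

def multitask_ib_loss(task_logits, task_targets, message_logits, beta=0.1):
    """Sum of per-task cross-entropy losses plus a message-complexity (rate) term.

    task_logits:    list of [batch, n_classes] tensors, one per task
    task_targets:   list of [batch] integer label tensors, one per task
    message_logits: [batch, msg_len, vocab] speaker logits over discrete symbols
    beta:           weight of the complexity penalty (assumed hyperparameter)
    """
    # Task terms: the listener must still solve every task from the message,
    # which pressures the speaker toward task-specific vocabulary.
    task_loss = sum(F.cross_entropy(logits, targets)
                    for logits, targets in zip(task_logits, task_targets))

    # Complexity term: KL from the speaker's per-symbol distribution to a
    # uniform prior, an upper bound on the information carried per symbol.
    log_probs = F.log_softmax(message_logits, dim=-1)
    probs = log_probs.exp()
    vocab = message_logits.size(-1)
    log_vocab = torch.log(torch.tensor(float(vocab)))
    kl_to_uniform = (probs * (log_probs + log_vocab)).sum(dim=-1)
    complexity = kl_to_uniform.mean()

    return task_loss + beta * complexity
```

In this sketch, raising `beta` trades task accuracy for lower message complexity, which is the pressure the abstract credits with reducing redundancy and entanglement in the emergent code.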
Supplementary Material: pdf
Primary Area: general machine learning (i.e., none of the above)
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 5675