Continual adaptation for efficient machine communication


16 May 2019 (modified: 15 Jun 2019) · ICML 2019 Workshop AMTL Blind Submission · Readers: Everyone
  • Keywords: communication, adaptation, human-machine interaction, catastrophic forgetting
  • TL;DR: We propose a repeated reference benchmark task and a regularized continual learning approach for adaptive communication with humans in unfamiliar domains
  • Abstract: To communicate with new partners in new contexts, humans rapidly form new linguistic conventions. Recent language models trained with deep neural networks are able to comprehend and produce the existing conventions present in their training data, but are not able to flexibly and interactively adapt those conventions on the fly as humans do. We introduce a repeated reference task as a benchmark for models of adaptation in communication and propose a regularized continual learning framework that allows an artificial agent initialized with a generic language model to understand its partner more accurately and efficiently over time. We evaluate this framework through simulations on COCO and in real-time reference game experiments with human partners.
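The regularized continual-learning idea in the abstract can be sketched as follows: after each round of the reference game, the agent fine-tunes on the new interaction while a penalty keeps its parameters close to the generic initial model, mitigating catastrophic forgetting. This is a minimal illustrative sketch; the function names and the simple quadratic (L2/EWC-style) penalty are assumptions, not the paper's exact objective.

```python
def regularized_loss(task_loss, weights, init_weights, lam=0.1):
    """Adaptation loss plus a quadratic pull toward the initial model.

    task_loss: scalar loss on the current partner's utterances (assumed given).
    weights / init_weights: current and initial model parameters, as flat lists.
    lam: regularization strength trading off adaptation vs. retention.
    """
    penalty = sum((w - w0) ** 2 for w, w0 in zip(weights, init_weights))
    return task_loss + lam * penalty


def sgd_step(weights, grads, lr=0.01):
    """One gradient-descent step on the regularized objective."""
    return [w - lr * g for w, g in zip(weights, grads)]
```

With `lam = 0` the agent adapts freely (and may forget its generic language model); larger `lam` anchors it more strongly to its initialization.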