Compositional Continual Language Learning

Anonymous

Sep 25, 2019 · ICLR 2020 Conference Blind Submission
  • Abstract: Motivated by humans' ability to continually learn and accumulate knowledge over time, several research efforts have pushed machines to learn continually while alleviating catastrophic forgetting: a significant drop in a skill that was acquired earlier in training. Most existing methods study continual learning through label prediction tasks. Humans, however, naturally interact with and learn from natural language statements and instructions, a setting far less studied from the continual learning angle. One key skill that enables humans to learn language efficiently is the ability to produce novel compositions. To learn and complete new tasks, robots must continually learn novel objects and concepts in linguistic form, which requires compositionality for efficient learning. Inspired by this, we propose a method for compositional continual learning of sequence-to-sequence models. Experimental results show that the proposed method significantly outperforms state-of-the-art methods: it enables knowledge transfer and prevents catastrophic forgetting, maintaining more than 85% accuracy over up to 100 continual learning stages, compared with less than 50% accuracy for baselines. It also yields a significant improvement on a machine translation task. To our knowledge, this is the first work to combine continual learning and compositionality for natural language instruction learning, and we hope it will help make robots more useful across a variety of tasks.
  • Code: https://drive.google.com/file/d/1tHSl8Z9tZIGJkBDELKYavfSfcQ62JAor/view?usp=sharing
  • Keywords: Compositionality, Continual Learning, Lifelong Learning, Sequence to Sequence Modeling