Compositional generalization through meta sequence-to-sequence learning

06 Sept 2019 (modified: 05 May 2023) · NeurIPS 2019 · Readers: Everyone
Abstract: People can generalize in systematic ways from just one or a few examples, understanding how to "blicket twice" after learning how to "blicket." In contrast, powerful sequence-to-sequence (seq2seq) neural networks can fail such tests of compositionality, especially when composing new concepts together with existing concepts. In this paper, we show that neural networks can be trained to generalize compositionally through meta sequence-to-sequence learning: networks are trained on a large set of seq2seq problems that require compositional generalization, and then generalize to new seq2seq problems with similar characteristics. This approach solves several of the SCAN tests for compositional learning and can learn to apply rules to variables.
CMT Num: 5200
Code Link: https://github.com/brendenlake/meta_seq2seq
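The abstract describes a meta-learning setup in which the word-to-meaning mapping is re-randomized on every training episode, so the network must consult a support set of demonstrations rather than memorize fixed word meanings. Below is a minimal sketch of how such episodes could be generated; the primitive words, action symbols, and episode format are illustrative assumptions, not the paper's exact SCAN setup. See the linked repository for the actual implementation.

```python
# Minimal sketch of episode generation for meta seq2seq training.
# Hypothetical primitives/actions; not the paper's exact vocabulary.
import random

PRIMITIVES = ["dax", "wif", "lug", "zup"]      # input words (assumed)
ACTIONS = ["RED", "GREEN", "BLUE", "YELLOW"]   # output symbols (assumed)

def make_episode(rng):
    """Sample one episode: the word->action mapping is re-randomized
    each episode, so a model must read the support set to answer."""
    mapping = dict(zip(PRIMITIVES, rng.sample(ACTIONS, len(ACTIONS))))
    # Support set: isolated demonstrations of each primitive.
    support = [(word, [act]) for word, act in mapping.items()]
    # Query: a novel composition, e.g. "<word> twice" -> repeat the action.
    word = rng.choice(PRIMITIVES)
    query = (f"{word} twice", [mapping[word], mapping[word]])
    return support, query

rng = random.Random(0)
support, query = make_episode(rng)
print(support)  # e.g. [('dax', ['GREEN']), ('wif', ['BLUE']), ...]
print(query)    # e.g. ('lug twice', ['RED', 'RED'])
```

Training a seq2seq model across many such episodes is what pushes it toward compositional behavior at test time: since the meaning of "dax" changes from episode to episode, the only reliable strategy is to bind the query's words to the support demonstrations and compose them.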