From SCAN to Real Data: Systematic Generalization via Meaningful Learning

Published: 28 Jan 2022, Last Modified: 13 Feb 2023 · ICLR 2022 Submission
Keywords: systematic generalization, meaningful learning, inductive learning, deductive learning, data augmentation
Abstract: Humans can systematically generalize to novel compositions of existing concepts. There has been extensive debate about the extent to which neural networks can do the same. Recent arguments, supported by evidence on the SCAN dataset, claim that neural networks inherently lack this cognitive capacity. In this paper, we revisit systematic generalization from the perspective of meaningful learning, an exceptional human capability to learn new concepts by connecting them with previously acquired knowledge. We propose to augment a training dataset in either an inductive or a deductive manner to build semantic links between new and old concepts. Our observations on SCAN suggest that, following the meaningful learning principle, modern sequence-to-sequence models, including RNNs, CNNs, and Transformers, can successfully generalize to compositions of new concepts. We further validate our findings on two real-world semantic parsing datasets, where consistent compositional generalization is also observed. Moreover, our experiments demonstrate that both prior knowledge and semantic linking play a key role in achieving systematic generalization, and that inductive learning generally works better than deductive learning in our experiments. Finally, we provide a unified explanation for existing data augmentation techniques by categorizing them as either inductive or deductive forms of meaningful learning. We hope our findings will encourage further exploration of existing neural networks' potential for systematic generalization through more advanced learning schemes.
One-sentence Summary: Modern sequence-to-sequence models can achieve systematic generalization via meaningful learning, with semantic links between new and old concepts established either inductively or deductively.
Supplementary Material: zip
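
As a companion to the abstract, the following is a minimal Python sketch of what the two augmentation flavors could look like on SCAN-style (command, action) pairs. The primitive names ("run", "gallop"), the synonym assumption, and the target-side encoding of a definition are illustrative assumptions, not the paper's exact templates.

# Minimal sketch (not the paper's exact templates) of the two augmentation
# flavors named in the abstract, on SCAN-style (command -> action) pairs.
# Assumption: the new word "gallop" is treated as a synonym of the known
# primitive "run"; all names and encodings below are illustrative.

OLD_WORD = "run"        # known primitive
OLD_ACTION = "RUN"      # its action token in SCAN
NEW_WORD = "gallop"     # new concept to be linked to the old one

# A few base training pairs that use the old primitive compositionally.
base_pairs = [
    ("run", "RUN"),
    ("run twice", "RUN RUN"),
    ("turn left and run", "LTURN RUN"),
]

def augment_inductive(pairs):
    """Inductive linking: add concrete usages of the new word by substituting
    it into commands that contain the old word, so the model can induce that
    the two words behave alike."""
    augmented = [
        (cmd.replace(OLD_WORD, NEW_WORD), act)
        for cmd, act in pairs
        if OLD_WORD in cmd.split()
    ]
    return pairs + augmented

def augment_deductive(pairs):
    """Deductive linking: add a single definition-style pair stating that the
    new word means the same as the old one; the model must then deduce the
    new word's behavior in larger compositions. The target-side encoding of
    a definition is an assumption here."""
    definition = (f"{NEW_WORD} means {OLD_WORD}", OLD_ACTION)
    return pairs + [definition]

print(augment_inductive(base_pairs))
print(augment_deductive(base_pairs))

Note the trade-off the sketch makes visible: inductive augmentation adds many concrete usage pairs, while deductive augmentation adds only one definition-style pair and leaves the composition to the model.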