Compositional Program Generation for Systematic Generalization

Published: 16 Jun 2023, Last Modified: 26 Jun 2023. IJCAI 2023 Workshop KBCG, Oral.
Keywords: machine learning, composition, compositional generalization, program generation, neuro-symbolic
TL;DR: A neuro-symbolic architecture which generalizes systematically and productively given a context-free grammar of the input language and a dictionary mapping each input word to its interpretation in the output language.
Abstract: Compositional generalization remains a difficult problem for neural models. There has been progress, but the hardest benchmark problems remain intractable without additional task-specific semantic information. In this paper we describe a neuro-symbolic architecture, the Compositional Program Generator (CPG), which generalizes systematically and productively on sequence-to-sequence language tasks, given a context-free grammar of the input language and a dictionary mapping each input word to its interpretation in the output language. Our approach learns to generate type-specific symbolic semantic functions, composed in an input-dependent way, to produce the output sequence. In experiments with SCAN, CPG solves all splits and few-shot generalizes on the systematicity ("add jump") split. On the COGS benchmark, the model achieves perfect generalization in 2 epochs of training on sentences of length less than or equal to 12.
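To make the recipe in the abstract concrete, here is a minimal toy sketch (not the paper's implementation) of compositional interpretation over a CFG parse: a dictionary maps input words to their output-language interpretations, and a rule-specific semantic function combines the interpretations of a node's children. The grammar, rule names, and semantic functions below are hypothetical illustrations; in CPG the semantic functions are learned rather than hand-written.

```python
# Toy sketch of composing per-rule semantic functions over a CFG parse tree
# for a SCAN-style command. All rule names and functions here are
# hypothetical; CPG learns which semantic function to apply per rule type.

# Dictionary: each input word maps to its output-language interpretation.
LEXICON = {"jump": ["JUMP"], "walk": ["WALK"], "twice": 2, "thrice": 3}

def sem_action(word):            # Action -> "jump" | "walk"
    return LEXICON[word]

def sem_repeat(action, count):   # Repeat -> Action Times
    return action * LEXICON[count]

def interpret(tree):
    """Recursively apply a rule's semantic function to the
    interpretations of its children."""
    rule, children = tree
    if rule == "Action":
        return sem_action(children[0])
    if rule == "Repeat":
        return sem_repeat(interpret(children[0]), children[1])
    raise ValueError(f"unknown rule: {rule}")

# Parse of "jump twice" -> ["JUMP", "JUMP"]
parse = ("Repeat", [("Action", ["jump"]), "twice"])
print(interpret(parse))
```

Under this framing, generalizing to a new word such as "jump" only requires a dictionary entry, since the composed semantic functions are tied to grammar rules rather than to specific lexical items, which is consistent with the few-shot "add jump" result reported above.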