Compositional Generalization from First Principles

Published: 21 Sept 2023 · Last Modified: 02 Nov 2023 · NeurIPS 2023 poster
Keywords: compositional generalization, compositionality, generalization, combinatorial generalization, out-of-distribution, out-of-domain, identifiability, disentanglement, object-centric learning, DSprites
TL;DR: We derive sufficient conditions on the data-generating process and model architecture under which compositional generalization provably occurs.
Abstract: Leveraging the compositional nature of our world to expedite learning and facilitate generalization is a hallmark of human perception. In machine learning, on the other hand, achieving compositional generalization has proven to be an elusive goal, even for models with explicit compositional priors. To get a better handle on compositional generalization, we approach it from the bottom up: Inspired by identifiable representation learning, we investigate compositionality as a property of the data-generating process rather than the data itself. This reformulation enables us to derive mild conditions on only the support of the training distribution and the model architecture, which are sufficient for compositional generalization. We further demonstrate how our theoretical framework applies to real-world scenarios and validate our findings empirically. Our results set the stage for a principled theoretical study of compositional generalization.
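The abstract's central claim — that conditions on only the training support, together with a model architecture matching the compositional structure of the data-generating process, suffice for generalization to unseen latent combinations — can be illustrated with a toy sketch. The following is a hypothetical example, not the authors' code: the additive two-slot generative process, the L-shaped training support, and the polynomial per-slot basis are all simplifying assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ground truth: two latent "slots", each rendered by its own
# nonlinear function; the observation composes the slots additively.
w1, w2 = rng.normal(size=(2, 5))

def generate(z):
    # z: (n, 2) array of latents in [0, 1]^2
    return np.outer(np.sin(4 * z[:, 0]), w1) + np.outer(np.cos(4 * z[:, 1]), w2)

# Training support: an L-shaped region. Each latent's marginal covers
# [0, 1], but the joint support omits the corner where both exceed 0.5,
# so those latent combinations are never observed together.
z = rng.uniform(0.0, 1.0, size=(4000, 2))
z_train = z[~((z[:, 0] > 0.5) & (z[:, 1] > 0.5))]
x_train = generate(z_train)

def slot_features(z):
    # Slot-wise polynomial basis: the model is constrained to additive
    # compositions of smooth per-slot functions, mirroring the assumed
    # structure of the data-generating process.
    degrees = np.arange(6)
    return np.concatenate(
        [z[:, :1] ** degrees, z[:, 1:2] ** degrees[1:]], axis=1
    )

# Fit the compositional model by least squares on the restricted support.
W_hat, *_ = np.linalg.lstsq(slot_features(z_train), x_train, rcond=None)

# Evaluate on latent combinations never seen jointly during training.
z_test = rng.uniform(0.5, 1.0, size=(1000, 2))
err = slot_features(z_test) @ W_hat - generate(z_test)
print("MSE on unseen latent combinations:", float(np.mean(err ** 2)))
```

In this sketch the reported error stays small: because each slot's latent range is fully covered somewhere in the training support and the model shares the generative process's additive slot structure, the per-slot functions are pinned down on the training data and compose correctly in the held-out corner.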
Supplementary Material: zip
Submission Number: 15124