Does Data Scaling Lead to Visual Compositional Generalization?

Published: 01 May 2025 (last modified: 18 Jun 2025) · ICML 2025 poster · CC BY 4.0
Abstract: Compositional understanding is crucial for human intelligence, yet it remains unclear whether contemporary vision models exhibit it. The dominant machine learning paradigm is built on the premise that scaling data and model sizes will improve out-of-distribution performance, including compositional generalization. We test this premise through controlled experiments that systematically vary data scale, concept diversity, and combination coverage. We find that compositional generalization is driven by data diversity, not mere data scale. Increased combinatorial coverage forces models to discover a linearly factored representational structure, in which concepts decompose into additive components. We prove that this structure is key to data efficiency, enabling perfect generalization from few observed combinations. Evaluating pretrained models (DINO, CLIP), we find above-random yet imperfect performance, suggesting this structure is only partially present. Our work motivates a stronger emphasis on constructing diverse datasets for compositional generalization and highlights the importance of representational structure that enables efficient compositional learning.
Lay Summary: Humans easily recognize new combinations of known concepts, like identifying a “green triangle” after seeing only “green squares” and “blue triangles.” We study whether simply increasing the amount of visual data helps AI achieve similar compositional generalization. Our results show that data quantity alone isn’t enough: data diversity is crucial. Only diverse training examples encourage models to form structured internal representations, enabling effective learning of new concept combinations. However, existing pretrained vision models still struggle with compositional generalization, highlighting the importance of structured representations.
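To make the “linearly factored” idea concrete, here is a minimal illustrative sketch (not the authors' code; all names, dimensions, and the random components are assumptions). If an image representation decomposes additively into a color vector plus a shape vector, a linear readout fit on a few observed combinations extrapolates to an unseen one such as “green triangle.”

```python
# Illustrative sketch of a linearly factored representation.
# Assumption: components are random vectors; in the paper they would be
# learned by a vision model, not sampled.
import numpy as np

rng = np.random.default_rng(0)
dim = 16

colors = {c: rng.normal(size=dim) for c in ["green", "blue"]}
shapes = {s: rng.normal(size=dim) for s in ["square", "triangle"]}

def represent(color, shape):
    """Linearly factored representation: additive in its concepts."""
    return colors[color] + shapes[shape]

# Fit a linear "is it green?" readout on three of the four combinations,
# leaving "green triangle" entirely unseen.
train = [("green", "square"), ("blue", "square"), ("blue", "triangle")]
X = np.stack([represent(c, s) for c, s in train])
y = np.array([1.0 if c == "green" else 0.0 for c, _ in train])
w, *_ = np.linalg.lstsq(X, y, rcond=None)

# The unseen combination is still scored correctly, because the color
# component enters the representation independently of the shape.
score = represent("green", "triangle") @ w
print(f"'green triangle' green-score: {score:.2f}")  # ~1.0
```

The point of the toy example is only that additivity lets the readout separate the color contribution from the shape contribution, so coverage of every combination is unnecessary once the factored structure is in place.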
Link To Code: https://github.com/oshapio/visual-compositional-generalization
Primary Area: General Machine Learning->Representation Learning
Keywords: ML, compositionality, generalization, OOD generalization, scaling, transfer learning
Submission Number: 15013