Position: Fodor and Pylyshyn's Legacy — Still No Human-like Systematic Compositionality in Neural Networks

23 Jan 2025 (modified: 18 Jun 2025) · Submitted to ICML 2025 Position Paper Track · CC BY 4.0
Abstract: The strength of human language and thought lies in their capacity for systematic compositionality: the meaning of a unit (semantics) can be inferred from its structure (syntax). While Fodor and Pylyshyn famously argued that neural networks inherently lack this capacity and are therefore not a viable model of the human mind, Lake and Baroni recently presented meta-learning as a pathway to compositionality. In this position paper, we critically evaluate this claim, highlighting limitations of the proposed framework of meta-learning for compositionality (MLC). Specifically, we identify a class of test cases, compatible with Lake and Baroni's setup, that consistently provokes transduction errors despite falling well within the scope of human-like abilities. We further identify overlooked yet essential elements required for substantive claims of systematic generalization. Therefore, despite the success of neural models in mimicking human behavior, it seems premature to claim that modern architectures have overcome the limitations raised by Fodor and Pylyshyn. This issue is pivotal to the AGI debate, as systematic generalization is crucial for human-like reasoning and adaptability.
Primary Area: Research Priorities, Methodology, and Evaluation
Keywords: meta-learning, compositionality, Fodor and Pylyshyn, systematic generalisation
Submission Number: 298