Compositional generalization through abstract representations in human and artificial neural networks

Published: 31 Oct 2022, Last Modified: 12 Jan 2023
Venue: NeurIPS 2022 (Accept)
Readers: Everyone
Keywords: neuroscience, cognition, compositionality, generalization, neural coding, abstraction, representations, human, fMRI, artificial neural networks
TL;DR: We study the impact of abstract representations on compositional generalization in human imaging data and simple artificial neural networks.
Abstract: Humans have a remarkable ability to rapidly generalize to new tasks that is difficult to reproduce in artificial learning systems. Compositionality has been proposed as a key mechanism supporting generalization in humans, but evidence of its neural implementation and impact on behavior is still scarce. Here we study the computational properties associated with compositional generalization in both humans and artificial neural networks (ANNs) on a highly compositional task. First, we identified behavioral signatures of compositional generalization in humans, along with their neural correlates, using whole-cortex functional magnetic resonance imaging (fMRI) data. Next, we designed pretraining paradigms aided by a procedure we term primitives pretraining to endow ANNs with compositional task elements. We found that ANNs with this prior knowledge had greater correspondence with human behavioral and neural compositional signatures. Importantly, primitives pretraining induced abstract internal representations, excellent zero-shot generalization, and sample-efficient learning. Moreover, it gave rise to a hierarchy of abstract representations that matched human fMRI data, where sensory rule abstractions emerged in early sensory areas and motor rule abstractions emerged in later motor areas. Our findings give empirical support to the role of compositional generalization in human behavior, implicate abstract representations as its neural implementation, and illustrate that these representations can be embedded into ANNs by designing simple and efficient pretraining procedures.
Supplementary Material: pdf