Data-driven emergence of convolutional structure in neural networks

Published: 07 Nov 2022 · Last Modified: 05 May 2023 · NeurReps 2022 Poster
Keywords: convolution, invariance, receptive fields
TL;DR: Fully-connected neural networks can learn a convolutional structure from scratch.
Abstract: Exploiting data invariances is crucial for efficient learning in both artificial and biological neural circuits, but can neural networks learn apposite representations from scratch? Convolutional neural networks, for example, were designed to exploit translation symmetry, yet learning convolutions directly from data has so far proven elusive. Here, we show how initially fully-connected neural networks solving a discrimination task can learn a convolutional structure directly from their inputs, resulting in localised, space-tiling receptive fields that match the filters of a convolutional network trained on the same task. By carefully designing data models for the visual scene, we show that the emergence of this pattern is triggered by the non-Gaussian, higher-order local structure of the inputs, which has long been recognised as the hallmark of natural images. We provide an analytical and numerical characterisation of the pattern-formation mechanism responsible for this phenomenon in a simple model and find an unexpected link between receptive field formation and tensor decomposition of higher-order input correlations.
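The abstract describes fully-connected networks developing localised receptive fields when trained to discriminate inputs with non-Gaussian, higher-order local structure from Gaussian controls. Below is a minimal, hedged numpy sketch of one such setup, not the paper's exact method: the data model (a saturating nonlinearity applied to a translation-invariant Gaussian process, versus a variance-matched Gaussian control), the network sizes, and all hyperparameters are illustrative assumptions. The inverse participation ratio computed at the end is one simple diagnostic of receptive-field localisation.

```python
import numpy as np

rng = np.random.default_rng(0)

D, K = 40, 8          # input size (1-D "retina"), hidden units -- illustrative
XI, GAIN = 2.0, 3.0   # latent correlation length, nonlinearity gain -- assumed
LR, STEPS, BATCH = 0.05, 20_000, 128

# Translation-invariant (circulant) covariance on a ring, so the data
# distribution is exactly translation symmetric.
i = np.arange(D)
d = np.minimum(np.abs(i[:, None] - i[None, :]),
               D - np.abs(i[:, None] - i[None, :]))
C = np.exp(-d**2 / (2 * XI**2))
L = np.linalg.cholesky(C + 1e-6 * np.eye(D))

def sample(n):
    """Half the batch: saturated (non-Gaussian) inputs; other half:
    Gaussian controls rescaled to roughly match the marginal variance."""
    z = rng.standard_normal((n, D)) @ L.T
    xp = np.tanh(GAIN * z)
    xp /= xp.std()
    xm = z / z.std()
    x = np.concatenate([xp[: n // 2], xm[n // 2:]])
    y = np.concatenate([np.ones(n // 2), np.zeros(n - n // 2)])
    return x, y

# Two-layer fully-connected net: sigmoid( v . tanh(W x) ), trained with
# plain SGD on the binary cross-entropy loss.
W = rng.standard_normal((K, D)) / np.sqrt(D)
v = rng.standard_normal(K) / np.sqrt(K)

for step in range(STEPS):
    x, y = sample(BATCH)
    h = np.tanh(x @ W.T)                   # hidden activations, (BATCH, K)
    p = 1.0 / (1.0 + np.exp(-(h @ v)))     # sigmoid output, (BATCH,)
    err = (p - y) / BATCH                  # dLoss/dlogit for sigmoid + BCE
    gv = h.T @ err
    gh = np.outer(err, v) * (1.0 - h**2)   # back-propagate through tanh
    gW = gh.T @ x
    W -= LR * gW
    v -= LR * gv

# Localisation diagnostic: inverse participation ratio of each receptive
# field; it approaches 1 for a delta-like (localised) field and 1/D for a
# fully delocalised one.
ipr = (W**4).sum(axis=1) / (W**2).sum(axis=1) ** 2
print("IPR per hidden unit:", np.round(ipr, 3))
print("peak positions:", np.abs(W).argmax(axis=1))
```

The circulant covariance makes the translation symmetry exact, so any localisation that emerges is driven by the inputs rather than boundary effects; under these assumptions, rows of W with IPR well above 1/D would indicate localised, space-tiling receptive fields of the kind the abstract describes.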