Proving the Lottery Ticket Hypothesis for Convolutional Neural Networks

29 Sept 2021, 00:34 (modified: 15 Mar 2022, 10:50) · ICLR 2022 Poster · Readers: Everyone
Keywords: lottery ticket hypothesis, convolutional neural network, network pruning, random subset sum, random neural network
Abstract: The lottery ticket hypothesis states that a randomly-initialized neural network contains a small subnetwork which, when trained in isolation, can compete with the performance of the original network. Recent theoretical works proved an even stronger version: every sufficiently overparameterized (dense) neural network contains a subnetwork that, even without training, achieves accuracy comparable to that of the trained large network. These works left open the problem of extending the result to convolutional neural networks (CNNs). In this work we provide such a generalization by showing that, with high probability, any CNN can be approximated by pruning a random CNN whose size is larger by only a logarithmic factor.
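The keywords point to the random subset sum technique underlying prior strong lottery ticket proofs: with high probability, a modest number of i.i.d. random weights contains a subset whose sum approximates any bounded target weight, and pruning amounts to selecting that subset. Below is a minimal brute-force sketch of that idea, not the paper's actual construction; `approximate_weight` is a hypothetical helper introduced here for illustration.

```python
import itertools
import random

def approximate_weight(target, samples):
    """Find the subset of `samples` whose sum best approximates `target`.

    Brute-force search over all subsets; fine for the small n used here.
    Random-subset-sum theory says that with n i.i.d. uniform[-1, 1]
    samples, the best error is O(2^{-n}) with high probability.
    """
    best_subset, best_err = (), abs(target)  # empty subset sums to 0
    for r in range(1, len(samples) + 1):
        for subset in itertools.combinations(samples, r):
            err = abs(sum(subset) - target)
            if err < best_err:
                best_subset, best_err = subset, err
    return best_subset, best_err

random.seed(0)
samples = [random.uniform(-1.0, 1.0) for _ in range(16)]
subset, err = approximate_weight(0.375, samples)
print(f"approximated 0.375 with error {err:.6f} using {len(subset)} of 16 samples")
```

With only 16 random samples the achievable error is already tiny, which is the intuition behind the logarithmic overparameterization factor: each target weight of the CNN to be approximated needs only about log(1/ε) random candidate weights.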
One-sentence Summary: We prove the lottery ticket hypothesis for convolutional neural networks
Supplementary Material: zip