Computational Separation Between Convolutional and Fully-Connected Networks

28 Sep 2020 (modified: 25 Jan 2021) · ICLR 2021 Poster
  • Keywords: Neural Networks, Deep Learning, Convolutional Networks, Fully-Connected Networks, Gradient Descent
  • Abstract: Convolutional neural networks (CNNs) exhibit unmatched performance in a multitude of computer vision tasks. However, the advantage of using convolutional networks over fully-connected networks is not understood from a theoretical perspective. In this work, we show how convolutional networks can leverage locality in the data, and thus achieve a computational advantage over fully-connected networks. Specifically, we show a class of problems that can be efficiently solved using convolutional networks trained with gradient descent, but that is at the same time hard to learn using a polynomial-size fully-connected network.
  • Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics
  • One-sentence Summary: We show a computational separation between convolutional and fully-connected networks, proving that the former can leverage strong local structure in the data.
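The intuition behind "leveraging strong local structure" can be illustrated with a hypothetical toy sketch (not taken from the paper): a 1-D convolutional filter of width k can detect a local pattern anywhere in a d-dimensional input using only k weights, whereas a fully-connected layer must dedicate a full set of d weights per position to match that translation invariance. All names and the specific pattern below are illustrative assumptions.

```python
# Toy sketch (assumption, not the paper's construction): a width-k
# convolutional filter detects a planted local pattern with k parameters,
# while a fully-connected layer needs d weights per output unit.
import numpy as np

d, k = 32, 3                            # input length, filter width
pattern = np.array([1.0, -1.0, 1.0])    # the local pattern to detect

x = np.zeros(d)
x[10:10 + k] = pattern                  # plant the pattern at position 10

# Convolution: slide the k-weight filter over x, take the max response.
conv_out = max(float(pattern @ x[i:i + k]) for i in range(d - k + 1))

# Parameter counts: one shared filter vs. one FC unit per position.
conv_params = k
fc_params = d * (d - k + 1)
print(conv_out)                 # the filter fires on the planted pattern
print(conv_params, fc_params)   # conv uses far fewer parameters
```

The point of the sketch is purely the parameter-count gap: weight sharing lets the convolutional filter reuse the same k weights at every position, which is the kind of locality structure the paper's separation result exploits.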