CNNs efficiently learn long-range dependencies

Oct 09, 2020 (edited Dec 07, 2020) · NeurIPS 2020 Workshop SVRHM Blind Submission
  • Keywords: feed-forward, CNN, recurrence, feedback
  • TL;DR: CNNs efficiently learn long-range dependencies
  • Abstract: The role of feedback (or recurrent) connections is a fundamental question in neuroscience and machine learning. Two recently proposed benchmarks [1,2], which require tracing paths in images, have been put forward as tasks for which recurrence helps achieve an efficient solution. In this work, we demonstrate that these tasks can be solved equally well, or even better, by a single efficient convolutional feed-forward neural network architecture. We analyze ResNet training with respect to model complexity and sample efficiency, and show that a narrow, parameter-efficient ResNet performs on par with the recurrent and computationally more complex hCNN and td+hCNN models from previous work on both benchmarks. Code:
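One intuition behind the abstract's claim is that a feed-forward convolutional stack can cover long paths because its receptive field grows with depth: each 3×3 convolution adds to the receptive field, and each stride-2 downsampling doubles the increment of every subsequent layer. A minimal sketch of this standard receptive-field calculation, using a hypothetical narrow ResNet-style layer configuration (not the exact architecture from the paper):

```python
def receptive_field(layers):
    """Receptive field of a stack of conv layers.

    Each layer is a (kernel_size, stride) pair. Standard recursion:
    rf grows by (k - 1) * jump, and jump multiplies by the stride.
    """
    rf, jump = 1, 1
    for k, s in layers:
        rf += (k - 1) * jump
        jump *= s
    return rf

# Hypothetical stack: a 7x7 stride-2 stem, then four stages of 3x3 convs,
# each stage downsampling (stride 2) at its first conv.
stem = [(7, 2)]
stages = []
for _ in range(4):
    stages.append((3, 2))         # first conv of the stage downsamples
    stages.extend([(3, 1)] * 3)   # remaining convs keep resolution

print(receptive_field(stem + stages))  # prints 427
```

A receptive field of a few hundred pixels already spans the entire input for typical path-tracing benchmark image sizes, which is consistent with a purely feed-forward network being able to follow long-range contours.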