CNNs efficiently learn long-range dependencies

Published: 03 Nov 2020, Last Modified: 05 May 2023
Venue: SVRHM@NeurIPS Poster
Keywords: feed-forward, CNN, recurrence, feedback
TL;DR: CNNs efficiently learn long-range dependencies
Abstract: The role of feedback (or recurrent) connections is a fundamental question in neuroscience and machine learning. Recently, two benchmarks [1,2] that require following paths in images have been proposed as examples where recurrence was considered helpful for solving them efficiently. In this work, we demonstrate that these tasks can be solved equally well, or even better, by a single, efficient, feed-forward convolutional neural network architecture. We analyze ResNet training with respect to model complexity and sample efficiency and show that a narrow, parameter-efficient ResNet performs on par with the recurrent and computationally more complex hCNN and td+hCNN models from previous work on both benchmarks. Code: https://eckerlab.org/code/cnn-efficient-path-tracing
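Below is a minimal PyTorch-style sketch of the kind of narrow, parameter-efficient ResNet the abstract describes. It is an illustration, not the authors' implementation: the class name NarrowResNet, the base width of 16 channels, the stage configuration, and the 150x150 grayscale input size are all assumptions; the exact architecture is in the linked code release.

```python
# Minimal sketch of a narrow, parameter-efficient ResNet (PyTorch).
# Width, depth, and input size below are illustrative assumptions,
# not the exact configuration from the paper's code release.
import torch
import torch.nn as nn


class BasicBlock(nn.Module):
    """Standard two-convolution residual block with identity/projection shortcut."""

    def __init__(self, in_ch, out_ch, stride=1):
        super().__init__()
        self.conv1 = nn.Conv2d(in_ch, out_ch, 3, stride=stride, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(out_ch)
        self.conv2 = nn.Conv2d(out_ch, out_ch, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(out_ch)
        self.relu = nn.ReLU(inplace=True)
        # Project the shortcut when the spatial size or channel count changes.
        self.shortcut = nn.Sequential()
        if stride != 1 or in_ch != out_ch:
            self.shortcut = nn.Sequential(
                nn.Conv2d(in_ch, out_ch, 1, stride=stride, bias=False),
                nn.BatchNorm2d(out_ch),
            )

    def forward(self, x):
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return self.relu(out + self.shortcut(x))


class NarrowResNet(nn.Module):
    """Narrow ResNet: few channels per stage keep the parameter count small,
    while stacked strided stages grow the receptive field enough to cover
    long paths across the input image."""

    def __init__(self, num_classes=2, width=16, blocks_per_stage=(2, 2, 2, 2)):
        super().__init__()
        self.stem = nn.Sequential(
            nn.Conv2d(1, width, 3, padding=1, bias=False),  # grayscale input
            nn.BatchNorm2d(width),
            nn.ReLU(inplace=True),
        )
        stages, in_ch = [], width
        for i, n in enumerate(blocks_per_stage):
            out_ch = width * (2 ** i)
            for j in range(n):
                # Downsample at the first block of every stage after the first.
                stride = 2 if (j == 0 and i > 0) else 1
                stages.append(BasicBlock(in_ch, out_ch, stride=stride))
                in_ch = out_ch
        self.stages = nn.Sequential(*stages)
        self.head = nn.Linear(in_ch, num_classes)

    def forward(self, x):
        out = self.stages(self.stem(x))
        out = out.mean(dim=(2, 3))  # global average pooling
        return self.head(out)


if __name__ == "__main__":
    model = NarrowResNet()
    x = torch.randn(1, 1, 150, 150)  # Pathfinder-style grayscale image (assumed size)
    print(model(x).shape)  # torch.Size([1, 2]) -- binary "connected or not" decision
    print(sum(p.numel() for p in model.parameters()))  # parameter count stays small
```

Keeping the network narrow caps the parameter count, while depth (with downsampling) supplies the large receptive field these path-following benchmarks demand; under that assumption, no recurrent or feedback connections are needed.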