Tandem Blocks in Deep Convolutional Neural Networks

15 Feb 2018 (modified: 15 Feb 2018) · ICLR 2018 Conference Blind Submission
Abstract: Due to the success of residual networks (resnets) and related architectures, shortcut connections have quickly become standard tools for building convolutional neural networks. The explanations in the literature for the apparent effectiveness of shortcuts are varied and often contradictory. We hypothesize that shortcuts work primarily because they act as linear counterparts to nonlinear layers. We test this hypothesis by using several variations on the standard residual block, with different types of linear connections, to build small (100k--1.2M parameter) image classification networks. Our experiments show that other kinds of linear connections can be even more effective than the identity shortcuts. Our results also suggest that the best type of linear connection for a given application may depend on both network width and depth.
TL;DR: We generalize residual blocks to tandem blocks, which use arbitrary linear maps instead of shortcuts, and improve performance over ResNets.
Keywords: resnet, residual, shortcut, convolutional, linear, skip, highway
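
To make the generalization concrete, here is a minimal sketch of a tandem block, assuming a PyTorch-style implementation in which the identity shortcut of a residual block is replaced by a learned 1x1 convolution (one possible "arbitrary linear map"). The class name `TandemBlock` and the specific layer arrangement are illustrative assumptions, not the authors' code.

```python
import torch
import torch.nn as nn

class TandemBlock(nn.Module):
    """Hypothetical sketch: a residual-style block whose shortcut is a
    learned linear map (here a 1x1 convolution) rather than the identity,
    following the generalization described in the abstract."""

    def __init__(self, in_channels, out_channels):
        super().__init__()
        # Nonlinear path: two 3x3 convolutions with a ReLU in between,
        # as in a standard residual block.
        self.nonlinear = nn.Sequential(
            nn.Conv2d(in_channels, out_channels, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(out_channels, out_channels, kernel_size=3, padding=1),
        )
        # Linear path: a learned 1x1 convolution replaces the identity
        # shortcut. With in_channels == out_channels and weights fixed to
        # the identity, this reduces to an ordinary residual block.
        self.linear = nn.Conv2d(in_channels, out_channels, kernel_size=1)

    def forward(self, x):
        # Sum the nonlinear and linear paths, then apply a final ReLU.
        return torch.relu(self.nonlinear(x) + self.linear(x))
```

For example, `TandemBlock(64, 64)` pairs the two paths at equal width; because the linear path is learned rather than fixed, it can also change the channel count, which an identity shortcut cannot.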