On the Generalization Effects of DenseNet Model Structures

15 Feb 2018 (modified: 15 Feb 2018) · ICLR 2018 Conference Blind Submission · Readers: Everyone
Abstract: Modern neural network architectures take advantage of increasingly deep layers and various structural advances to achieve better performance. While traditional explicit regularization techniques such as dropout, weight decay, and data augmentation are still used in these new models, little has been studied about the regularization and generalization effects of the new structures themselves. Besides being deeper than their predecessors, could newer architectures such as ResNet and DenseNet also benefit from the implicit regularization properties of their structures? In this work, we investigate the effect of skip connections on a network's generalization behaviour. Through experiments, we show that certain architectural structures contribute to a network's generalization ability. Specifically, we study the effect that low-level features have on generalization performance when they are introduced to deeper layers in DenseNet, ResNet, and other networks with skip connections. We show that these low-level representations do help with generalization in multiple settings where both the quality and quantity of training data are decreased.
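To make concrete what "introducing low-level features to deeper layers" means here, the following is a minimal sketch (not code from the paper) of a DenseNet-style block in PyTorch, where each layer receives the concatenation of all preceding feature maps. The class name MiniDenseBlock and its parameters are illustrative assumptions, not the authors' implementation.

# Minimal DenseNet-style block: each layer's input is the concatenation of
# all earlier feature maps, so low-level features reach deeper layers directly.
import torch
import torch.nn as nn

class MiniDenseBlock(nn.Module):
    def __init__(self, in_channels: int, growth_rate: int, num_layers: int):
        super().__init__()
        self.layers = nn.ModuleList()
        channels = in_channels
        for _ in range(num_layers):
            self.layers.append(nn.Sequential(
                nn.BatchNorm2d(channels),
                nn.ReLU(inplace=True),
                nn.Conv2d(channels, growth_rate, kernel_size=3, padding=1, bias=False),
            ))
            # Each new layer sees all earlier outputs, so its input width grows.
            channels += growth_rate

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        features = [x]
        for layer in self.layers:
            # Concatenating every earlier feature map is the "skip connection"
            # that carries low-level representations into deeper layers.
            out = layer(torch.cat(features, dim=1))
            features.append(out)
        return torch.cat(features, dim=1)

# Usage: a 3-channel input, 4 layers with growth rate 12 -> 3 + 4*12 = 51 output channels.
block = MiniDenseBlock(in_channels=3, growth_rate=12, num_layers=4)
y = block(torch.randn(2, 3, 32, 32))
print(y.shape)  # torch.Size([2, 51, 32, 32])

In a ResNet-style variant, the concatenation would be replaced by an additive identity shortcut; the paper's experiments compare how such connections affect generalization when training data is limited or noisy.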
TL;DR: Our paper analyses the representational power of networks with skip connections, which may serve as a means to better generalization.
Keywords: Skip connection, generalization, regularization, deep network, representation.