Data augmentation instead of explicit regularization

25 Sept 2019 (modified: 22 Oct 2023) · ICLR 2020 Conference Blind Submission · Readers: Everyone
TL;DR: Deep neural networks trained with data augmentation do not require any other explicit regularization (such as weight decay and dropout) and exhibit greater adaptability to changes in the architecture and the amount of training data.
Abstract: Modern deep artificial neural networks have achieved impressive results with models that have orders of magnitude more parameters than training examples and that control overfitting with the help of regularization. Regularization can be implicit, as is the case with stochastic gradient descent and parameter sharing in convolutional layers, or explicit. Explicit regularization techniques, whose most common forms are weight decay and dropout, have proven successful in terms of improved generalization, but they blindly reduce the effective capacity of the model, introduce sensitive hyper-parameters and require deeper and wider architectures to compensate for the reduced capacity. In contrast, data augmentation techniques exploit domain knowledge to increase the number of training examples and improve generalization without reducing the effective capacity and without introducing model-dependent parameters, since they are applied to the training data. In this paper we systematically contrast data augmentation and explicit regularization on three popular architectures and three data sets. Our results demonstrate that data augmentation alone can achieve the same or higher performance than regularized models and exhibits much higher adaptability to changes in the architecture and the amount of training data.
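To make the contrast concrete, the sketch below sets up the two training regimes the abstract compares: augmentation only (random crops and flips, no weight decay, no dropout) versus explicit regularization only (weight decay and dropout, no augmentation). This is a minimal illustration in PyTorch, not the authors' code; the small CNN, the transforms and the hyper-parameter values are assumptions chosen for brevity, whereas the paper evaluates standard large architectures.

```python
# Minimal sketch of the two regimes compared in the paper (illustrative only).
import torch
import torch.nn as nn
import torchvision
import torchvision.transforms as T

# Regime A: data augmentation only (no weight decay, no dropout).
augment = T.Compose([
    T.RandomCrop(32, padding=4),   # light geometric augmentation
    T.RandomHorizontalFlip(),
    T.ToTensor(),
])

# Regime B: explicit regularization only (no augmentation).
plain = T.Compose([T.ToTensor()])

def make_model(dropout_p=0.0):
    # Deliberately small CNN for CIFAR-10-sized inputs; the paper uses
    # larger standard architectures.
    return nn.Sequential(
        nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        nn.Flatten(),
        nn.Dropout(p=dropout_p),
        nn.Linear(64 * 8 * 8, 10),
    )

def train(transform, dropout_p, weight_decay, epochs=1):
    data = torchvision.datasets.CIFAR10(root="./data", train=True,
                                        download=True, transform=transform)
    loader = torch.utils.data.DataLoader(data, batch_size=128, shuffle=True)
    model = make_model(dropout_p)
    opt = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9,
                          weight_decay=weight_decay)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()
    return model

# Augmentation alone ...
model_aug = train(augment, dropout_p=0.0, weight_decay=0.0)
# ... versus weight decay + dropout without augmentation.
model_reg = train(plain, dropout_p=0.5, weight_decay=5e-4)
```

Note that the augmentation regime leaves the model and optimizer untouched and only changes the training data pipeline, which is the property the abstract highlights: no model-dependent hyper-parameters are introduced and the effective capacity is not reduced.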
Code: https://www.dropbox.com/sh/ki0syyl0bvp29rl/AABmmbqC-Ft86rzJ5WyUTo4da?dl=0
Keywords: data augmentation, implicit regularization, explicit regularization, object recognition, convolutional neural networks
Community Implementations: [4 code implementations (CatalyzeX)](https://www.catalyzex.com/paper/arxiv:1806.03852/code)