Adjusting for Dropout Variance in Batch Normalization and Weight Initialization
Dan Hendrycks, Kevin Gimpel
Feb 07, 2017 (modified: Mar 15, 2017) · ICLR 2017 workshop submission · readers: everyone
Abstract: We show how to adjust for the variance introduced by dropout with corrections to weight initialization and Batch Normalization, yielding higher accuracy. Although dropout can preserve the expected input to a neuron between train and test, the variance of that input differs. We therefore propose a new weight initialization that corrects, through simple corrective scalars, for the effect of the dropout rate and of an arbitrary nonlinearity on the variance. Moreover, since Batch Normalization trained with dropout estimates the variance of a layer's incoming distribution with some inputs dropped, its variance estimate also differs between train and test. After training a network with Batch Normalization and dropout, we simply update Batch Normalization's variance moving averages with dropout off and obtain state-of-the-art results on CIFAR-10 and CIFAR-100 without data augmentation.
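The Batch Normalization fix described in the abstract amounts to one extra pass over the training data after training is finished, with dropout disabled, so that the running variance estimates reflect the dropout-free activations seen at test time. Below is a minimal sketch of that re-estimation step in PyTorch; the model, loader, and function name are illustrative assumptions, not the authors' code.

```python
import torch
import torch.nn as nn

# Hypothetical network mixing Batch Normalization and dropout (illustrative only).
model = nn.Sequential(
    nn.Linear(784, 256),
    nn.BatchNorm1d(256),
    nn.ReLU(),
    nn.Dropout(p=0.5),
    nn.Linear(256, 10),
)

def refresh_bn_statistics(model, loader, device="cpu"):
    """Re-estimate BatchNorm running statistics with dropout turned off.

    BatchNorm layers are kept in train mode so their running mean/variance
    moving averages get updated, while dropout layers are switched to eval
    mode so no units are dropped, matching test-time conditions.
    """
    model.train()
    for m in model.modules():
        if isinstance(m, (nn.Dropout, nn.Dropout2d, nn.Dropout3d)):
            m.eval()  # disable dropout; BN now sees dropout-free activations
    with torch.no_grad():
        for x, _ in loader:
            model(x.to(device))  # forward passes update BN running stats
    model.eval()
```

Under this sketch, `refresh_bn_statistics(model, train_loader)` would be called once after training and before evaluation; only Batch Normalization's moving averages change, not the learned weights.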
TL;DR: Batch Norm incorrectly estimates variance when dropout is on.