POP-Norm: A Theoretically Justified and More Accelerated Normalization Approach

25 Sept 2019 (modified: 05 May 2023) · ICLR 2020 Conference Blind Submission · Readers: Everyone
Abstract: Batch Normalization (BatchNorm) has become a default module in modern deep networks because of its effectiveness in accelerating the training of deep neural networks. It is widely believed that BatchNorm's success stems from reducing internal covariate shift (ICS), but recent work has shown that the link between the two is fairly weak. The intrinsic reason behind BatchNorm's effectiveness thus remains unclear, which limits how well it can be exploited. In light of this, we propose a new normalization approach, referred to as Pre-Operation Normalization (POP-Norm), which is theoretically guaranteed to speed up training convergence. Not surprisingly, POP-Norm and BatchNorm are largely similar, and these similarities help us theoretically interpret the root of BatchNorm's effectiveness. There remain, however, some significant distinctions between the two approaches, and it is precisely these distinctions that allow POP-Norm to achieve a faster convergence rate and better performance than BatchNorm, as validated in extensive experiments on the benchmark datasets CIFAR-10, CIFAR-100, and ILSVRC2012.
Keywords: Batch Normalization, Optimization, Accelerate Training
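The abstract contrasts POP-Norm's placement of normalization with BatchNorm's, but does not spell out the formulation. The sketch below is only an illustration of the placement difference suggested by the name "Pre-Operation Normalization": a standard conv-then-BatchNorm block versus a hypothetical block that normalizes the input over the batch before the weighted operation. The class names `ConvBNBlock` and `PreOpNormBlock` are illustrative and not taken from the paper.

```python
import torch
import torch.nn as nn

# Standard placement: convolution -> BatchNorm -> activation.
class ConvBNBlock(nn.Module):
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1, bias=False)
        self.bn = nn.BatchNorm2d(out_ch)   # normalizes the conv output over the batch
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.act(self.bn(self.conv(x)))


# Hypothetical "pre-operation" placement inferred from the name POP-Norm:
# the input is normalized over the batch *before* the weighted operation.
# This is an assumption for illustration; the paper's exact formulation is
# not given in the abstract.
class PreOpNormBlock(nn.Module):
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.norm = nn.BatchNorm2d(in_ch)  # normalization placed before the convolution
        self.conv = nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1, bias=False)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.act(self.conv(self.norm(x)))


if __name__ == "__main__":
    x = torch.randn(8, 16, 32, 32)          # (batch, channels, height, width)
    print(ConvBNBlock(16, 32)(x).shape)      # torch.Size([8, 32, 32, 32])
    print(PreOpNormBlock(16, 32)(x).shape)   # torch.Size([8, 32, 32, 32])
```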