Sequential Normalization: an improvement over Ghost Normalization

28 Sept 2020 (modified: 05 May 2023) · ICLR 2021 Conference Withdrawn Submission · Readers: Everyone
Abstract: Batch normalization (BatchNorm) is an effective yet poorly understood technique for neural network optimization. It is often assumed that the degradation of BatchNorm performance at smaller batch sizes stems from having to estimate layer statistics from smaller samples. However, Ghost Normalization (GhostNorm), a variant of BatchNorm that explicitly uses smaller sample sizes for normalization, has recently been shown to improve upon BatchNorm on some datasets. Our contributions are: (i) we describe three implementations of GhostNorm, two of which employ BatchNorm as the underlying normalization technique; (ii) we uncover a source of regularization that is unique to GhostNorm, rather than merely inherited from BatchNorm, and visualize the difference in their loss landscapes; (iii) we extend GhostNorm and introduce a new normalization layer called Sequential Normalization (SeqNorm); (iv) we compare both GhostNorm and SeqNorm against BatchNorm alone as well as in combination with other regularization techniques; (v) for both GhostNorm and SeqNorm, we report performance superior to state-of-the-art methodologies on the CIFAR-10, CIFAR-100, and ImageNet datasets.
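For readers unfamiliar with the technique, the following is a minimal PyTorch sketch of Ghost Normalization in the spirit of the "BatchNorm as the underlying normalization technique" implementations mentioned in the abstract: the mini-batch is split into smaller ghost batches and each chunk is normalized using statistics computed on that chunk alone. The class name `GhostNorm2d` and the parameter `num_splits` are illustrative assumptions, not the authors' API, and the exact implementations from the paper may differ.

```python
import torch
import torch.nn as nn


class GhostNorm2d(nn.Module):
    """Sketch of GhostNorm for (N, C, H, W) inputs using a shared BatchNorm2d."""

    def __init__(self, num_features: int, num_splits: int = 4):
        super().__init__()
        self.num_splits = num_splits
        # A single BatchNorm2d supplies the learnable affine parameters and
        # running statistics shared across all ghost batches.
        self.bn = nn.BatchNorm2d(num_features)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if self.training:
            # Split the batch dimension into `num_splits` ghost batches and
            # normalize each one with statistics from that chunk only.
            chunks = x.chunk(self.num_splits, dim=0)
            return torch.cat([self.bn(chunk) for chunk in chunks], dim=0)
        # At inference, fall back to standard BatchNorm with running statistics.
        return self.bn(x)


if __name__ == "__main__":
    layer = GhostNorm2d(num_features=8, num_splits=4)
    out = layer(torch.randn(32, 8, 16, 16))  # 32-sample batch -> 4 ghost batches of 8
    print(out.shape)  # torch.Size([32, 8, 16, 16])
```

The smaller, noisier per-chunk statistics are the source of the extra regularization the abstract attributes to GhostNorm relative to plain BatchNorm.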
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics
Reviewed Version (pdf): https://openreview.net/references/pdf?id=9V3Nzwhbo