Proxy-Normalizing Activations to Match Batch Normalization while Removing Batch Dependence

21 May 2021, 20:46 (modified: 21 Jan 2022, 16:48) · NeurIPS 2021 Poster · Readers: Everyone
Keywords: Batch-Independent Normalization, Batch Normalization, Layer Normalization, Group Normalization, Instance Normalization, Deep Learning Theory, CNNs, ResNets, ResNeXts, EfficientNets
TL;DR: We introduce a batch-independent normalization technique that consistently matches batch normalization in both behavior and performance.
Abstract: We investigate the reasons for the performance degradation incurred with batch-independent normalization. We find that the prototypical techniques of layer normalization and instance normalization both induce the appearance of failure modes in the neural network's pre-activations: (i) layer normalization induces a collapse towards channel-wise constant functions; (ii) instance normalization induces a lack of variability in instance statistics, symptomatic of an alteration of the expressivity. To alleviate failure mode (i) without aggravating failure mode (ii), we introduce the technique "Proxy Normalization" that normalizes post-activations using a proxy distribution. When combined with layer normalization or group normalization, this batch-independent normalization emulates batch normalization's behavior and consistently matches or exceeds its performance.
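To make the mechanism concrete, here is a minimal PyTorch-style sketch of the idea described in the abstract, not the authors' reference implementation: after a batch-independent normalization (e.g., group normalization without its own affine transform), the usual affine transform and activation are applied, and the post-activations are then re-normalized channel-wise using the statistics of a "proxy" Gaussian variable pushed through the same affine transform and activation. The names ProxyNorm2d and num_proxy_samples, and the use of sampled proxy points rather than quadrature, are illustrative assumptions.

# Minimal sketch of proxy-normalized post-activations (illustrative, not the
# reference implementation).
import torch
import torch.nn as nn


class ProxyNorm2d(nn.Module):
    def __init__(self, num_channels, activation=nn.GELU(),
                 num_proxy_samples=256, eps=1e-5):
        super().__init__()
        self.activation = activation
        self.eps = eps
        # Affine parameters applied after the batch-independent normalization.
        self.gamma = nn.Parameter(torch.ones(1, num_channels, 1, 1))
        self.beta = nn.Parameter(torch.zeros(1, num_channels, 1, 1))
        # Fixed samples from a standard Gaussian acting as the proxy distribution
        # of the normalized pre-activations (quadrature points are an alternative).
        self.register_buffer("proxy", torch.randn(num_proxy_samples, 1))

    def forward(self, x):
        # x is assumed to be already normalized (e.g., by GroupNorm with affine=False).
        y = self.activation(self.gamma * x + self.beta)

        # Proxy post-activations: the same affine transform and activation applied
        # to the Gaussian proxy, independently for each channel.
        gamma = self.gamma.view(1, -1)                          # (1, C)
        beta = self.beta.view(1, -1)                            # (1, C)
        proxy_y = self.activation(gamma * self.proxy + beta)    # (S, C)

        # Channel-wise proxy statistics re-normalize the real post-activations,
        # emulating batch normalization without any batch dependence.
        mean = proxy_y.mean(dim=0).view(1, -1, 1, 1)
        std = proxy_y.std(dim=0).view(1, -1, 1, 1)
        return (y - mean) / (std + self.eps)


# Example usage: proxy normalization on top of a batch-independent normalization.
gn = nn.GroupNorm(num_groups=8, num_channels=64, affine=False)
pn = ProxyNorm2d(num_channels=64)
out = pn(gn(torch.randn(2, 64, 16, 16)))

Because the proxy statistics depend only on the per-channel affine parameters and the activation, the re-normalization is identical for every example, which is what removes the batch dependence in this sketch.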
Supplementary Material: pdf
Code Of Conduct: I certify that all co-authors of this work have read and commit to adhering to the NeurIPS Statement on Ethics, Fairness, Inclusivity, and Code of Conduct.