Equi-normalization of Neural Networks

Published: 21 Dec 2018, Last Modified: 14 Oct 2024. ICLR 2019 Conference Blind Submission. Readers: Everyone
Abstract: Modern neural networks are over-parametrized. In particular, each rectified linear hidden unit can be modified by a multiplicative factor by adjusting input and output weights, without changing the rest of the network. Inspired by the Sinkhorn-Knopp algorithm, we introduce a fast iterative method for minimizing the L2 norm of the weights, equivalently the weight-decay regularizer. It provably converges to a unique solution. Interleaving our algorithm with SGD during training improves the test accuracy. For small batches, our approach offers an alternative to batch- and group-normalization on CIFAR-10 and ImageNet with a ResNet-18.
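The rescaling invariance the abstract describes admits a simple closed-form balancing step: rescaling a hidden neuron's incoming weights by d and its outgoing weights by 1/d leaves the function unchanged (ReLU is positively homogeneous), and the per-neuron penalty d²a + b/d² (with a, b the incoming and outgoing squared norms) is minimized at d = (b/a)^(1/4). Below is a minimal NumPy sketch of one such balancing pass over a fully connected ReLU network, cycled Sinkhorn-style over layers; the function name `enorm_step`, the toy dimensions, and the restriction to dense layers are illustrative assumptions, not the authors' released (convolutional, PyTorch) implementation, which is in the repository linked below.

```python
import numpy as np

def enorm_step(weights, biases):
    """One Sinkhorn-style balancing pass over a fully connected ReLU network.

    weights[l] has shape (n_{l+1}, n_l), mapping x -> weights[l] @ x + biases[l].
    Each hidden neuron's incoming row (and bias) is rescaled by d and its
    outgoing column by 1/d; positive homogeneity of ReLU keeps the network
    function unchanged, while d minimizes the total squared L2 norm.
    """
    for l in range(len(weights) - 1):
        w_in, w_out = weights[l], weights[l + 1]
        # Energy flowing into neuron i: its incoming row plus its bias.
        in_sq = np.sum(w_in ** 2, axis=1) + biases[l] ** 2
        # Energy flowing out of neuron i: its outgoing column.
        out_sq = np.sum(w_out ** 2, axis=0)
        # Minimize d^2 * in_sq + out_sq / d^2  =>  d = (out_sq / in_sq)^(1/4).
        d = (out_sq / (in_sq + 1e-12)) ** 0.25
        weights[l] = w_in * d[:, None]
        biases[l] = biases[l] * d
        weights[l + 1] = w_out / d[None, :]
    return weights, biases

# Toy usage: repeat passes until the rescaling factors converge to 1.
rng = np.random.default_rng(0)
dims = [8, 16, 16, 4]
weights = [rng.standard_normal((dims[i + 1], dims[i])) for i in range(3)]
biases = [rng.standard_normal(dims[i + 1]) for i in range(3)]
for _ in range(50):
    weights, biases = enorm_step(weights, biases)
print(sum(np.sum(w ** 2) for w in weights))  # total L2 energy after balancing
```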
Keywords: convolutional neural networks, normalization, Sinkhorn, regularization
TL;DR: Fast iterative algorithm to balance the energy of a network while staying in the same functional equivalence class
Code: [facebookresearch/enorm](https://github.com/facebookresearch/enorm)
Community Implementations: [1 code implementation](https://www.catalyzex.com/paper/equi-normalization-of-neural-networks/code)