Normalization-Equivariant Neural Networks with Application to Image Denoising

Published: 21 Sept 2023, Last Modified: 02 Nov 2023 · NeurIPS 2023 poster
Keywords: equivariance, normalization, image denoising, activation functions, ReLU, interpretability, robustness, deep learning, analysis of neural networks
TL;DR: Replacing the classic “conv+ReLU” scheme with affine-constrained convolutions and sort pooling nonlinearities achieves normalization-equivariance at no cost in performance, which improves generalization across noise levels in image denoising.
Abstract: In many information processing systems, it may be desirable to ensure that any change of the input, whether by shifting or scaling, results in a corresponding change in the system response. While deep neural networks are gradually replacing all traditional automatic processing methods, they surprisingly do not guarantee this normalization-equivariance (scale + shift) property, which can be detrimental in many applications. To address this issue, we propose a methodology for adapting existing neural networks so that normalization-equivariance holds by design. Our main claim is that not only ordinary convolutional layers, but also all activation functions, including the ReLU (rectified linear unit), which are applied element-wise to the pre-activated neurons, should be removed entirely from neural networks and replaced by better-conditioned alternatives. To this end, we introduce affine-constrained convolutions and channel-wise sort pooling layers as surrogates and show that these two architectural modifications preserve normalization-equivariance without loss of performance. Experimental results in image denoising show that normalization-equivariant neural networks, in addition to being better conditioned, also generalize much better across noise levels.
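To make the two building blocks concrete, below is a minimal PyTorch sketch (not the authors' released code): an affine-constrained convolution whose filter weights are reprojected to sum to one with no bias term, and a channel-wise sort pooling layer that sorts consecutive pairs of channels in place of an element-wise ReLU. The class names AffineConv2d and SortPool, the reprojection used in forward, and the pairing of adjacent channels are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AffineConv2d(nn.Module):
    """Convolution constrained to compute affine combinations: no bias term
    and weights summing to one per output channel, so that
    conv(a*x + b) = a*conv(x) + b for any scalars a and b."""
    def __init__(self, in_channels, out_channels, kernel_size):
        super().__init__()
        self.weight = nn.Parameter(
            0.01 * torch.randn(out_channels, in_channels, kernel_size, kernel_size))

    def forward(self, x):
        w = self.weight
        # Project each filter onto the constraint "coefficients sum to 1".
        w = w - w.mean(dim=(1, 2, 3), keepdim=True) + 1.0 / w[0].numel()
        return F.conv2d(x, w, bias=None)

class SortPool(nn.Module):
    """Channel-wise sort pooling: each consecutive pair of channels is
    replaced by its (min, max). Sorting is monotone, so it commutes with
    any increasing affine rescaling a*x + b (a > 0)."""
    def forward(self, x):
        n, c, h, w = x.shape            # requires an even number of channels
        x = x.view(n, c // 2, 2, h, w)
        x, _ = torch.sort(x, dim=2)
        return x.view(n, c, h, w)

# Quick equivariance check (no padding, so no border effects):
x = torch.randn(1, 4, 16, 16)
a, b = 2.5, -0.7
f = nn.Sequential(AffineConv2d(4, 4, 3), SortPool())
assert torch.allclose(f(a * x + b), a * f(x) + b, atol=1e-5)
```

With these two layers stacked, the network output satisfies f(a*x + b) = a*f(x) + b for any a > 0 and any b, which is exactly the normalization-equivariance property described in the abstract.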
Supplementary Material: pdf
Submission Number: 4932