Convex Regularization Behind Neural Reconstruction

Published: 12 Jan 2021, Last Modified: 05 May 2023
Venue: ICLR 2021 Poster
Readers: Everyone
Keywords: neural networks, image reconstruction, denoising, interpretability, robustness, neural reconstruction, convex duality, inverse problems, sparsity, convex optimization
Abstract: Neural networks have shown tremendous potential for reconstructing high-resolution images in inverse problems. The non-convex and opaque nature of neural networks, however, hinders their utility in sensitive applications such as medical imaging. To cope with this challenge, this paper advocates a convex duality framework that makes a two-layer fully-convolutional ReLU denoising network amenable to convex optimization. The convex dual network not only enables globally optimal training with convex solvers, but also facilitates interpreting training and prediction. In particular, it implies that training neural networks with weight-decay regularization induces path sparsity, while prediction amounts to piecewise linear filtering. A range of experiments with the MNIST and fastMRI datasets confirms the efficacy of the dual network optimization problem.
One-sentence Summary: This work proposes a finite-dimensional convex dual of a two-layer fully-convolutional ReLU network for denoising problems and uses it to interpret neural network training and prediction.
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics
Supplementary Material: zip
Data: [MNIST](https://paperswithcode.com/dataset/mnist), [fastMRI](https://paperswithcode.com/dataset/fastmri)
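
To make the duality concrete, here is a minimal sketch of the kind of convex program this line of work builds on: the finite-dimensional convex reformulation of a two-layer ReLU network trained with weight decay. For readability it uses a fully-connected layer on toy data rather than the paper's fully-convolutional architecture, samples only a subset of the ReLU activation patterns (exhaustive enumeration is exponential), and solves the program with cvxpy; all names (`X`, `y`, `beta`, `patterns`) are illustrative and not the authors' code.

```python
import numpy as np
import cvxpy as cp

# Toy denoising data: each row of X is a vectorized noisy patch,
# y holds the corresponding clean target values.
np.random.seed(0)
n, d = 20, 5                       # number of samples, patch dimension
X = np.random.randn(n, d)
y = np.random.randn(n)
beta = 0.1                         # weight-decay (regularization) strength

# Sample candidate ReLU activation patterns D_i = diag(1[X h >= 0])
# from random hyperplanes h. With only a sample of patterns this is an
# approximation of the full convex reformulation.
G = np.random.randn(d, 50)
patterns = np.unique((X @ G >= 0).astype(int), axis=1).T   # shape (p, n)
p = len(patterns)

v = [cp.Variable(d) for _ in range(p)]
w = [cp.Variable(d) for _ in range(p)]

pred = 0
constraints = []
for i, s in enumerate(patterns):
    DiX = s[:, None] * X                  # D_i X, with D_i = diag(s)
    pred = pred + DiX @ (v[i] - w[i])
    A = (2 * s - 1)[:, None] * X          # (2 D_i - I) X
    constraints += [A @ v[i] >= 0,        # v_i, w_i must respect pattern i
                    A @ w[i] >= 0]

# Group-norm regularizer: induces sparsity across activation patterns,
# mirroring the path sparsity that weight decay induces in the ReLU network.
reg = sum(cp.norm(v[i], 2) + cp.norm(w[i], 2) for i in range(p))

prob = cp.Problem(cp.Minimize(0.5 * cp.sum_squares(pred - y) + beta * reg),
                  constraints)
prob.solve()

active = sum(np.linalg.norm(v[i].value) + np.linalg.norm(w[i].value) > 1e-6
             for i in range(p))
print("optimal value:", prob.value)
print("active patterns:", active, "of", p)
```

At the optimum, only a few `(v_i, w_i)` pairs are nonzero, and each active pair acts as a linear filter on the inputs that share activation pattern `D_i`. This is the interpretation highlighted in the abstract: weight decay induces path sparsity, and the trained network's prediction is piecewise linear filtering.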