PLUGIn-CS: A simple algorithm for compressive sensing with generative prior

Published: 19 Oct 2021, Last Modified: 05 May 2023
NeurIPS 2021 Deep Inverse Workshop Poster
Keywords: deep generative model, inverse problem, compressive sensing, gradient-type method, non-convex optimization
Abstract: We consider the problem of recovering an unknown latent code vector under a known generative model from compressive measurements. For a $d$-layer deep generative network $\mathcal{G}:\mathbb{R}^{n_0}\rightarrow \mathbb{R}^{n_d}$ with ReLU activation functions and a compressive measurement matrix $\Phi \in \mathbb{R}^{m\times n_d}$, let the observation be $\Phi\mathcal{G}(x)+\epsilon$, where $\epsilon$ is noise. We introduce a simple novel algorithm, Partially Linearized Update for Generative Inversion in Compressive Sensing (PLUGIn-CS), to estimate $x$ (and thus $\mathcal{G}(x)$). We prove that, when the sensing matrix and network weights are Gaussian, if the layer widths satisfy $n_i \gtrsim 5^i n_0$ and the number of measurements satisfies $m \gtrsim 2^d n_0$ (both up to log factors), then the algorithm converges geometrically to a (small) neighbourhood of $x$ with high probability. Note that the inequality on layer widths allows $n_i>n_{i+1}$ when $i\geq 1$, and thus the network may have some contractive layers. After a sufficient number of iterations, the estimation errors for both $x$ and $\mathcal{G}(x)$ are at most on the order of $\sqrt{4^d n_0/m}\,\|\epsilon\|$. Numerical experiments on synthetic and real data are provided to validate our theoretical results and to illustrate that the algorithm can effectively recover images from compressive measurements.
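The sketch below is a minimal numerical illustration of the setup described in the abstract, not the authors' reference implementation: it builds a Gaussian ReLU generator $\mathcal{G}$ and a Gaussian sensing matrix $\Phi$, forms noisy measurements $y=\Phi\mathcal{G}(x)+\epsilon$, and runs a gradient-type recovery loop in which the residual is pulled back through $\Phi^\top$ and the plain weight transposes (our reading of "partially linearized", i.e. the ReLU derivative pattern is ignored). The layer widths, number of measurements, step size `eta`, iteration count, and the $2^d$ rescaling are illustrative assumptions chosen to roughly respect the stated conditions $n_i \gtrsim 5^i n_0$ and $m \gtrsim 2^d n_0$; the paper's precise PLUGIn-CS update and step-size choice may differ.

```python
# Illustrative sketch only (assumptions flagged in comments); not the paper's exact update.
import numpy as np

rng = np.random.default_rng(0)

# Layer widths n_0, ..., n_d and number of measurements m (illustrative choices,
# roughly matching n_i >~ 5^i n_0 and m >~ 2^d n_0 from the abstract).
widths = [10, 60, 300, 1500]          # n_0 = 10, d = 3 layers
m = 200

# Gaussian weights W_i in R^{n_i x n_{i-1}} (iid N(0, 1/n_i) entries) and
# Gaussian sensing matrix Phi in R^{m x n_d} (iid N(0, 1/m) entries).
Ws = [rng.normal(size=(widths[i + 1], widths[i])) / np.sqrt(widths[i + 1])
      for i in range(len(widths) - 1)]
Phi = rng.normal(size=(m, widths[-1])) / np.sqrt(m)

def G(x):
    """d-layer ReLU generator: G(x) = relu(W_d ... relu(W_1 x))."""
    for W in Ws:
        x = np.maximum(W @ x, 0.0)
    return x

# Ground-truth latent code and noisy compressive measurements y = Phi G(x*) + eps.
x_true = rng.normal(size=widths[0])
y = Phi @ G(x_true) + 0.005 * rng.normal(size=m)

# Gradient-type recovery loop with a partially linearized backward pass:
# the residual is pulled back through Phi^T and the weight transposes only,
# and rescaled by 2^d to undo the average halving effect of each ReLU layer.
d = len(Ws)
eta = 0.5          # step size (assumption; the paper derives its own scaling)
x_hat = np.zeros(widths[0])
for _ in range(100):
    residual = Phi @ G(x_hat) - y
    direction = Phi.T @ residual
    for W in reversed(Ws):
        direction = W.T @ direction
    x_hat = x_hat - eta * (2 ** d) * direction

rel_err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
print(f"relative latent error: {rel_err:.3f}")
```

Running this typically drives the relative latent error down to roughly the noise level within a few dozen iterations, consistent with the geometric convergence and the $\sqrt{4^d n_0/m}\,\|\epsilon\|$ error scale stated in the abstract, though the exact constants here reflect the illustrative choices above.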