Compressed Sensing with Approximate Priors via Conditional Resampling

Oct 23, 2020 (edited Dec 03, 2020) · NeurIPS 2020 Workshop Deep Inverse · Blind Submission · Readers: Everyone
  • Keywords: Compressed sensing, deep generative priors, Wasserstein distance, Langevin dynamics, invertible generative models
  • TL;DR: We propose a new measure of complexity for distributions supported on R^n and show that conditional resampling achieves near-optimal recovery guarantees
  • Abstract: We characterize the measurement complexity of compressed sensing of signals drawn from a known prior distribution, even when the support of the prior is the entire space (rather than, say, sparse vectors). We show for Gaussian measurements and *any* prior distribution on the signal, that the conditional resampling estimator achieves near-optimal recovery guarantees. Moreover, this result is robust to model mismatch, as long as the distribution estimate (e.g., from an invertible generative model) is close to the true distribution in Wasserstein distance. We implement the conditional resampling estimator for deep generative priors using Langevin dynamics, and empirically find that it produces accurate estimates with more diversity than MAP.
  • Conference Poster: pdf
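The conditional resampling estimator described in the abstract draws a sample from the posterior p(x | y) rather than returning its mode (MAP). A minimal sketch of the Langevin dynamics implementation is below; for illustration it assumes a standard Gaussian prior in place of a deep generative model, and all names and parameter values (step size `eta`, noise level `sigma`, iteration count) are hypothetical choices, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Problem setup: y = A x* + noise, with Gaussian measurements A.
n, m = 50, 30          # signal dimension, number of measurements
sigma = 0.05           # measurement noise standard deviation
A = rng.normal(size=(m, n)) / np.sqrt(m)
x_true = rng.normal(size=n)              # signal drawn from the (Gaussian) prior
y = A @ x_true + sigma * rng.normal(size=m)

def grad_log_posterior(x):
    # log p(x|y) = log p(y|x) + log p(x) + const.
    # With a deep generative prior, grad_prior would come from the model's
    # log-density (e.g., an invertible network); here we use a standard
    # Gaussian prior for illustration.
    grad_lik = A.T @ (y - A @ x) / sigma**2
    grad_prior = -x
    return grad_lik + grad_prior

# Unadjusted Langevin dynamics:
#   x <- x + (eta/2) * grad log p(x|y) + sqrt(eta) * standard Gaussian noise.
eta = 1e-4
x = np.zeros(n)
for _ in range(20000):
    x = x + 0.5 * eta * grad_log_posterior(x) + np.sqrt(eta) * rng.normal(size=n)

# x is now an approximate sample from p(x|y) — one draw of the
# conditional resampling estimator. Repeating the chain with different
# noise yields diverse reconstructions, unlike a single MAP estimate.
```

Because each run of the chain produces a fresh posterior sample, diversity across reconstructions comes for free: rerunning the loop with a different random seed gives another plausible signal consistent with the measurements.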