Lower Bounds on Randomly Preconditioned Lasso via Robust Sparse Designs

Published: 31 Oct 2022, 18:00, Last Modified: 10 Oct 2022, 18:54
NeurIPS 2022 Accept
Keywords: sparse linear regression, statistical/computational gaps, compressed sensing with adversarial erasure, preconditioning
TL;DR: We construct an ill-conditioned Gaussian SLR task where Lasso with randomized preconditioning provably fails, based on a new connection to erasure-robustness.
Abstract: Sparse linear regression with ill-conditioned Gaussian random covariates is widely believed to exhibit a statistical/computational gap, but there is surprisingly little formal evidence for this belief. Recent work has shown that, for certain covariance matrices, the broad class of Preconditioned Lasso programs provably cannot succeed on polylogarithmically sparse signals with a sublinear number of samples. However, this lower bound holds only against deterministic preconditioners, and in many contexts randomization is crucial to the success of preconditioners. We prove a stronger lower bound that rules out randomized preconditioners. For an appropriate covariance matrix, we construct a single signal distribution on which any invertibly-preconditioned Lasso program fails with high probability, unless it receives a linear number of samples. Surprisingly, at the heart of our lower bound is a new robustness result in compressed sensing. In particular, we study the problem of recovering a sparse signal when a few measurements can be adversarially erased. To our knowledge, this natural question has not been studied before for sparse measurements. We show that, unexpectedly, standard sparse Bernoulli measurements are almost optimally robust to adversarial erasures: if $b$ measurements are erased, then all but $O(b)$ of the coordinates of the signal are still identifiable.
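To make the class of programs ruled out by the lower bound concrete, here is a minimal illustrative sketch (not the paper's construction), assuming the common change-of-basis formulation of the Preconditioned Lasso in which an invertible preconditioner S reparametrizes w = S z and the ordinary Lasso is run in the z-basis. The function name, the cvxpy-based solver, and the regularization parameter lam are illustrative choices, not taken from the paper.

```python
# Sketch only: a generic invertibly-preconditioned Lasso, under the assumed
# change-of-basis formulation  min_z (1/(2n))||y - X S z||_2^2 + lam*||z||_1,
# returning w = S z. Names and parameters are illustrative, not the paper's.
import numpy as np
import cvxpy as cp

def preconditioned_lasso(X, y, S, lam):
    """Run the Lasso in the basis induced by the invertible preconditioner S."""
    n, d = X.shape
    z = cp.Variable(d)
    objective = cp.Minimize(cp.sum_squares(y - X @ S @ z) / (2 * n) + lam * cp.norm1(z))
    cp.Problem(objective).solve()
    return S @ z.value  # estimate of the regression vector w

# Toy usage: with S = I this reduces to the ordinary Lasso on a noiseless
# sparse instance with well-conditioned (identity-covariance) Gaussian design.
rng = np.random.default_rng(0)
n, d, k = 50, 100, 3
X = rng.standard_normal((n, d))
w_star = np.zeros(d)
w_star[:k] = 1.0
y = X @ w_star
w_hat = preconditioned_lasso(X, y, np.eye(d), lam=0.01)
```

In this framing, a preconditioning strategy amounts to choosing S (possibly at random) as a function of the known covariance matrix; the paper's result says that for its ill-conditioned covariance and signal distribution, no such choice of invertible S succeeds with high probability without a linear number of samples.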