Abstract: We analyze the performance of the least absolute shrinkage and selection operator (Lasso) for the linear model when the number of regressors 𝑁 grows large while the true support size 𝑑 stays finite, i.e., the ultra-sparse case. The result is based on a novel treatment of the non-rigorous replica method in statistical physics, which has previously been applied only to settings where 𝑁, 𝑑, and the number of observations 𝑀 tend to infinity at the same rate. Our analysis makes it possible to assess the average performance of Lasso with Gaussian sensing matrices without assumptions on the scaling of 𝑁 and 𝑀, the noise distribution, or the profile of the true signal. Under mild conditions on the noise distribution, the analysis also offers a lower bound on the sample complexity necessary for partial and perfect support recovery when 𝑀 diverges as 𝑀 = 𝑂(log 𝑁). The obtained bound for perfect support recovery generalizes that given in the previous literature, which considers only Gaussian noise and diverging 𝑑. Extensive numerical experiments strongly support our analysis.
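The ultra-sparse setting described above can be sketched numerically. The following is a minimal illustration, not the paper's method: it draws a Gaussian sensing matrix, plants a signal on a finite support of size 𝑑, and solves the Lasso by ISTA (proximal gradient descent); the dimensions, signal values, noise level, and regularization strength are all arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Ultra-sparse setting: support size d stays small while N grows.
N, M, d = 200, 40, 3                              # regressors, observations, true support
X = rng.standard_normal((M, N)) / np.sqrt(M)      # Gaussian sensing matrix, unit-norm columns on average
beta_true = np.zeros(N)
beta_true[:d] = [3.0, -2.0, 1.5]                  # true signal on a finite support
y = X @ beta_true + 0.1 * rng.standard_normal(M)  # noisy linear observations

# Lasso: minimize 0.5 * ||y - X b||^2 + lam * ||b||_1, solved by ISTA.
lam = 0.4
L = np.linalg.norm(X, 2) ** 2                     # Lipschitz constant of the smooth part
b = np.zeros(N)
for _ in range(2000):
    grad = X.T @ (X @ b - y)
    z = b - grad / L
    b = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-thresholding step

support = np.flatnonzero(np.abs(b) > 1e-6)
```

With the signal well above the noise level and 𝑀 comfortably larger than 𝑑, the recovered support typically coincides with the true one; the paper's analysis characterizes how small 𝑀 can be (on the order of log 𝑁) before such recovery fails.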