Second order Poincaré inequalities and de-biasing arbitrary convex regularizers when p/n→γ
Abstract: A new Central Limit Theorem (CLT) is developed for random variables of the form ξ = z⊤f(z) − div f(z), where z ∼ N(0, Iₙ). The normal approximation is proved to hold when the squared norm of f(z) dominates the squared Frobenius norm of ∇f(z) in expectation.
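In the abstract's notation, a plausible formal reading of this statement is the following (the exact regularity conditions and norming constants are spelled out in the paper, and are assumptions here): for weakly differentiable f: ℝⁿ → ℝⁿ, Stein's identity E[z⊤f(z)] = E[div f(z)] gives E[ξ] = 0, and the CLT asserts that

  ξ / (E‖f(z)‖²)^{1/2} → N(0, 1) in distribution whenever E‖∇f(z)‖_F² / E‖f(z)‖² → 0,

so that ξ exhibits Gaussian fluctuations at scale (E‖f(z)‖²)^{1/2} once the gradient term is negligible.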
Applications of this CLT are given for the asymptotic normality of de-biased estimators in linear regression with correlated design and convex penalty in the regime p/n → γ ∈ (0, ∞). For the estimation of linear functionals ⟨a0, β⟩ of the unknown coefficient vector β, this analysis leads to asymptotic normality of the de-biased estimate for most normalized directions a0, where "most" is quantified in a precise sense. This asymptotic normality holds for any coercive convex penalty if γ < 1 and for any strongly convex penalty if γ ≥ 1; in particular, the penalty need not be separable or permutation invariant. For the group Lasso, a simple condition is given that ensures asymptotic normality for a fixed direction a0. By allowing arbitrary regularizers, the results vastly broaden the scope of applicability of de-biasing methodologies for obtaining confidence intervals in high dimensions.
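For orientation, and not as the paper's exact construction, de-biased estimates of ⟨a0, β⟩ are classically built as a one-step correction of the penalized estimator: writing X for the design matrix, y for the response, β̂ for the penalized estimator and Σ for the design covariance (notation introduced here for illustration), the corrected estimate

  ⟨a0, β̂⟩ + a0⊤Σ⁻¹X⊤(y − Xβ̂)/n

uses the residuals y − Xβ̂ to remove the first-order bias introduced by the penalty. The present results concern when corrections of this general type, suitably adjusted for the proportional regime p/n → γ, admit a Gaussian limit for arbitrary convex penalties.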
In the absence of strong convexity for p > n, asymptotic normality of the de-biased estimate is obtained under additional conditions that are naturally satisfied by the Lasso and the group Lasso.