Abstract: New upper bounds are developed for the L2 distance between ξ/Var[ξ]^{1/2} and linear and quadratic functions of z∼N(0,I_n) for random variables of the form ξ=z^Tf(z)−div f(z). The linear approximation yields a central limit theorem when the squared norm of f(z) dominates the squared Frobenius norm of ∇f(z) in expectation. Applications of this normal approximation are given for the asymptotic normality of de-biased estimators in linear regression with correlated design and convex penalty in the regime p/n≤γ for constant γ∈(0,∞). For the estimation of linear functions ⟨a_0,β⟩ of the unknown coefficient vector β, this analysis leads to asymptotic normality of the de-biased estimate for most normalized directions a_0, where ``most'' is quantified in a precise sense. This asymptotic normality holds for any convex penalty if γ<1 and for any strongly convex penalty if γ≥1. In particular, the penalty need not be separable or permutation invariant. By allowing arbitrary regularizers, the results vastly broaden the scope of applicability of de-biasing methodologies for obtaining confidence intervals in high dimensions. In the absence of strong convexity for p>n, asymptotic normality of the de-biased estimate is obtained for the Lasso and the group Lasso under additional conditions. For general convex penalties, our analysis also provides prediction and estimation error bounds of independent interest.
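As a toy illustration (not from the paper itself): taking f(z)=z gives ξ=z^Tf(z)−div f(z)=‖z‖²−n with Var[ξ]=2n, and the CLT described above then reduces to the classical normal approximation of a centered chi-squared statistic. A minimal Monte Carlo sketch in Python, with all variable names (n, reps) chosen for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 200, 20000  # dimension and number of Monte Carlo replications

# With f(z) = z we have div f(z) = n, so xi = z^T f(z) - div f(z) = ||z||^2 - n
# and Var[xi] = 2n exactly.
z = rng.standard_normal((reps, n))
xi = (z ** 2).sum(axis=1) - n
xi_norm = xi / np.sqrt(2 * n)

# The normalized statistic should be close to standard normal.
print(xi_norm.mean(), xi_norm.std())
```

Here the squared norm E‖f(z)‖² = n dominates the squared Frobenius norm E‖∇f(z)‖_F² = n only up to a constant, which is the boundary case; the empirical mean and standard deviation of the normalized statistic are nonetheless close to 0 and 1, consistent with the stated CLT.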