Keywords: Multi-Index Models, Low-Degree Polynomials, Computational-Statistical Gaps
TL;DR: We define the "generative leap" exponent, which tightly captures the sample complexity of efficiently learning Gaussian multi-index models.
Abstract: In this work we consider generic Gaussian multi-index models, in which the labels depend on the $d$-dimensional Gaussian inputs only through their projection onto a hidden subspace of dimension $r = O_d(1)$, and we study efficient agnostic estimation procedures for this hidden subspace. We introduce the *generative leap* exponent $k^\star$, a natural extension of the generative exponent of Damian et al. (2024) to the multi-index setting. We show that a sample complexity of $n = \Theta(d^{1 \vee k^\star/2})$ is necessary within the class of algorithms captured by the Low-Degree Polynomial framework, and also sufficient, by giving a sequential estimation procedure based on a spectral U-statistic over appropriate Hermite tensors.
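To make the spectral idea concrete, here is a minimal illustrative sketch, not the paper's sequential U-statistic procedure: it forms the plug-in estimate of the degree-2 Hermite tensor $\frac{1}{n}\sum_i y_i (x_i x_i^\top - I_d)$ and takes its top-$r$ eigenspace. The toy model $y = z_1 z_2$, the dimensions, and the identity label transformation are all assumptions made for the example; in general the choice of label transformation and Hermite degree is exactly what the generative leap exponent governs.

```python
import numpy as np

# Toy sketch (assumed setup, not the paper's exact method): recover the
# hidden subspace from the empirical degree-2 Hermite tensor
#   M = (1/n) sum_i y_i (x_i x_i^T - I),
# a plug-in moment estimator rather than the full sequential U-statistic.

rng = np.random.default_rng(0)
d, r, n = 50, 2, 20_000  # illustrative sizes; n = Theta(d) suffices here

# Hidden subspace: r orthonormal directions in R^d
U, _ = np.linalg.qr(rng.standard_normal((d, r)))

# Gaussian inputs; labels depend on x only through z = U^T x.
# y = z1 * z2 is detectable at Hermite degree 2 but not degree 1.
X = rng.standard_normal((n, d))
Z = X @ U
y = Z[:, 0] * Z[:, 1] + 0.1 * rng.standard_normal(n)

# Empirical degree-2 Hermite tensor: (1/n) sum_i y_i x_i x_i^T - mean(y) I
M = (X.T * y) @ X / n - y.mean() * np.eye(d)

# Top-r eigenspace by absolute eigenvalue estimates the hidden subspace
evals, evecs = np.linalg.eigh(M)
U_hat = evecs[:, np.argsort(-np.abs(evals))[:r]]

# Alignment check: singular values of U^T U_hat are the cosines of the
# principal angles; values near 1 mean the subspace was recovered.
print(np.linalg.svd(U.T @ U_hat, compute_uv=False))
```

With these settings the printed singular values are close to 1, i.e., the top eigenspace of the degree-2 Hermite matrix aligns with the planted subspace; higher generative leap exponents would require higher-order Hermite tensors and correspondingly more samples, per the $n = \Theta(d^{1 \vee k^\star/2})$ rate.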
Supplementary Material: zip
Primary Area: Theory (e.g., control theory, learning theory, algorithmic game theory)
Submission Number: 13696