Generative Principal Component Analysis

29 Sept 2021 (edited 04 Mar 2022) · ICLR 2022 Poster
  • Keywords: Principal component analysis, generative models, sparse principal component analysis, projected power methods, optimal statistical rates
  • Abstract: In this paper, we study the problem of principal component analysis under generative modeling assumptions, adopting a general model for the observed matrix that encompasses notable special cases, including spiked matrix recovery and phase retrieval. The key assumption is that the first principal eigenvector lies near the range of an $L$-Lipschitz continuous generative model with bounded $k$-dimensional inputs. We propose a quadratic estimator and show that it enjoys a statistical rate of order $\sqrt{\frac{k\log L}{m}}$, where $m$ is the number of samples. Moreover, we provide a variant of the classic power method, which projects the iterate onto the range of the generative model at each iteration. We show that under suitable conditions, this method converges exponentially fast to a point achieving the above-mentioned statistical rate. This rate is conjectured in~\citep{aubin2019spiked,cocola2020nonasymptotic} to be the best possible, even when restricting to the special case of spiked matrix models. We perform experiments on various image datasets for spiked matrix and phase retrieval models, and illustrate performance gains of our method over the classic power method and the truncated power method devised for sparse principal component analysis.
  • One-sentence Summary: We study the problem of principal component analysis with generative modeling assumptions, and provide a corresponding efficient algorithm with provable guarantees.
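The projected power method described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names are ours, and a fixed linear subspace projection stands in for the (generally nonlinear) projection onto the range of the generative model, which in practice would require an optimization step over the generator's latent space.

```python
import numpy as np

def projected_power_method(V, project, num_iters=50, seed=0):
    """Power iteration with a projection step each round.

    V       : symmetric (n, n) matrix estimated from the m samples
              (e.g., an empirical covariance or spiked matrix).
    project : maps a vector in R^n onto the range of the generative
              model (here approximated by a linear projection).
    """
    n = V.shape[0]
    rng = np.random.default_rng(seed)
    # Initialize inside the model's range, normalized to unit length.
    w = project(rng.standard_normal(n))
    w /= np.linalg.norm(w)
    for _ in range(num_iters):
        # Classic power step, then project back onto the range.
        w = project(V @ w)
        w /= np.linalg.norm(w)
    return w

# Toy check on a spiked matrix: the planted eigenvector x_star lies in a
# k-dimensional subspace standing in for the generator's range.
rng = np.random.default_rng(0)
n, k = 50, 5
G = rng.standard_normal((n, k))
Q, _ = np.linalg.qr(G)                    # orthonormal basis of the "range"
project = lambda x: Q @ (Q.T @ x)         # orthogonal projection onto range(Q)

x_star = project(rng.standard_normal(n))
x_star /= np.linalg.norm(x_star)
noise = 0.1 * rng.standard_normal((n, n))
V = 5.0 * np.outer(x_star, x_star) + (noise + noise.T) / 2  # spiked matrix

w = projected_power_method(V, project)
print(abs(w @ x_star))  # correlation with the planted spike, close to 1
```

In this toy setting the projection shrinks the noise seen by the iteration to the k-dimensional subspace, which is the intuition behind the $\sqrt{k\log L / m}$ rate replacing the ambient-dimension dependence of the unprojected power method.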