Guarantees of Riemannian optimization for low rank matrix recovery

17 May 2023 · OpenReview Archive Direct Upload
Abstract: We establish theoretical recovery guarantees for a family of Riemannian optimization algorithms for low rank matrix recovery, the problem of recovering an $m \times n$ rank $r$ matrix from $p < mn$ linear measurements. The algorithms are first interpreted as iterative hard thresholding algorithms with subspace projections. Based on this connection, we show that, provided the restricted isometry constant $R_{3r}$ of the sensing operator is less than $C_\kappa/\sqrt{r}$, the Riemannian gradient descent algorithm and a restarted variant of the Riemannian conjugate gradient algorithm are guaranteed to converge linearly to the underlying rank $r$ matrix when initialized by one step of hard thresholding. Empirical evaluation shows that the algorithms recover a low rank matrix from nearly the minimum number of measurements necessary.
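To make the iterative-hard-thresholding view concrete, below is a minimal NumPy sketch of plain iterative hard thresholding for low rank matrix recovery, including the one-step hard-thresholding initialization $X_0 = \mathcal{H}_r(\mathcal{A}^*(y))$ mentioned in the abstract. This is an illustrative assumption, not the paper's exact method: the function names (`hard_threshold`, `iht_matrix_recovery`), the Gaussian sensing setup, and the fixed unit step size are all hypothetical, and the tangent-space (subspace) projection that distinguishes the Riemannian variants is omitted.

```python
import numpy as np

def hard_threshold(X, r):
    # Best rank-r approximation of X via truncated SVD (Eckart-Young).
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r, :]

def iht_matrix_recovery(A, y, m, n, r, step=1.0, n_iters=500, tol=1e-8):
    # A: (p, m*n) sensing matrix acting on row-major vectorized matrices.
    # Initialization by one step of hard thresholding: X0 = H_r(A^T y).
    X = hard_threshold((A.T @ y).reshape(m, n), r)
    for _ in range(n_iters):
        residual = A @ X.ravel() - y
        grad = (A.T @ residual).reshape(m, n)   # gradient of 0.5*||A(X) - y||^2
        X_new = hard_threshold(X - step * grad, r)
        if np.linalg.norm(X_new - X) < tol:
            return X_new
        X = X_new
    return X

# Usage: recover a random rank-2 matrix from p < mn Gaussian measurements.
rng = np.random.default_rng(0)
m, n, r = 20, 20, 2
p = 5 * r * (m + n - r)          # oversample the r(m+n-r) degrees of freedom
X_true = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
A = rng.standard_normal((p, m * n)) / np.sqrt(p)  # normalization keeps RIP-like scaling
y = A @ X_true.ravel()
X_hat = iht_matrix_recovery(A, y, m, n, r)
print(np.linalg.norm(X_hat - X_true) / np.linalg.norm(X_true))
```

When the sensing operator satisfies a restricted isometry property, the gradient step is approximately norm-preserving on low rank matrices, which is why the fixed unit step suffices in this sketch; the paper's Riemannian algorithms instead use exact step-size rules and subspace projections.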