Strictly Low Rank Constraint Optimization --- An Asymptotically $\mathcal{O}(\frac{1}{t^2})$ Method

Published: 20 Jun 2023 · Last Modified: 11 Oct 2023 · SODS 2023 Poster
Keywords: Sparsity, non-convex, non-smooth, support set
Abstract: We study a class of non-convex and non-smooth problems with rank regularization, which induces sparsity in the singular values of the solution. We propose to solve the problem with the proximal gradient descent method and to accelerate it with a novel support set projection operation applied to the singular values of the iterates. We show that our algorithms achieve a convergence rate of $\mathcal{O}(\frac{1}{t^2})$, which matches Nesterov's optimal convergence rate for first-order methods on smooth convex problems, as well as the rate of the standard accelerated proximal gradient descent method on convex problems. Moreover, the obtained solutions are highly sparse, and the support set of the singular values of each iterate shrinks during the updates, which improves the interpretability of the algorithms.
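The abstract does not include pseudocode, so the following is a minimal sketch of the general recipe it describes: a gradient step on the smooth part of the objective, followed by a projection acting on the singular values of the iterate to enforce a sparse spectrum (low rank). All names here (`grad_f`, `svd_hard_threshold`, `step`, `iters`) and the use of plain top-$k$ hard thresholding are illustrative assumptions; the paper's actual support set projection and acceleration scheme are its own contribution and may differ.

```python
import numpy as np

def svd_hard_threshold(X, k):
    """Keep only the k largest singular values of X, i.e. project X
    onto the set of matrices with rank at most k (sparse spectrum)."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    s[k:] = 0.0          # zero out the tail of the spectrum
    return (U * s) @ Vt  # reassemble the rank-k iterate

def proximal_gradient_low_rank(grad_f, X0, k, step=1e-2, iters=200):
    """Proximal-gradient-style loop for min f(X) s.t. rank(X) <= k:
    a gradient step on the smooth term f, then singular-value
    thresholding as the (non-convex) projection step."""
    X = X0.copy()
    for _ in range(iters):
        X = svd_hard_threshold(X - step * grad_f(X), k)
    return X

# Toy usage: recover a rank-5 matrix M with f(X) = 0.5 * ||X - M||_F^2,
# whose gradient is simply X - M.
rng = np.random.default_rng(0)
M = rng.standard_normal((50, 5)) @ rng.standard_normal((5, 40))
X_hat = proximal_gradient_low_rank(lambda X: X - M, np.zeros_like(M), k=5)
```

On this toy objective the loop converges to the best rank-$k$ approximation of `M`; the acceleration step and the shrinking support set claimed in the abstract are not reproduced in this sketch.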
Submission Number: 15