Exponentially Convergent Algorithms for Supervised Matrix Factorization

Published: 21 Sept 2023, Last Modified: 27 Dec 2023. NeurIPS 2023 poster.
Keywords: Supervised matrix factorization, multi-objective optimization, global convergence, linear convergence, statistical estimation
TL;DR: Supervised matrix factorization, a multi-objective non-convex problem, is lifted to a low-rank matrix estimation problem and solved with projected low-rank gradient descent algorithms. Exponential convergence to global optima is established.
Abstract: Supervised matrix factorization (SMF) is a classical machine learning method that simultaneously performs feature extraction and classification, two tasks that are not necessarily a priori aligned. Our goal is to use SMF to learn low-rank latent factors that offer interpretable, data-reconstructive, and class-discriminative features, addressing challenges posed by high-dimensional data. Training an SMF model involves solving a nonconvex and possibly constrained optimization problem with at least three blocks of parameters. Known algorithms are either heuristic or provide weak convergence guarantees for special cases. In this paper, we provide a novel framework that `lifts' SMF as a low-rank matrix estimation problem in a combined factor space and propose an efficient algorithm that provably converges exponentially fast to a global minimizer of the objective from arbitrary initialization under mild assumptions. Our framework applies to a wide range of SMF-type problems for multi-class classification with auxiliary features. To showcase an application, we demonstrate that our algorithm successfully identified well-known cancer-associated gene groups for various cancers.
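The core building block of the approach, projected low-rank gradient descent, can be sketched on a toy problem. The snippet below is our own minimal illustration, not the paper's actual SMF loss or algorithm: it minimizes a simple quadratic objective over rank-constrained matrices, projecting onto the rank-r set after each gradient step via truncated SVD (all function names are ours).

```python
import numpy as np

def project_rank(Z, r):
    # Project onto the set of rank-<=r matrices via truncated SVD
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r]

def projected_gd(M, r, step=0.5, iters=200):
    # Projected gradient descent for min ||Z - M||_F^2 over rank-<=r Z
    # (a stand-in for the lifted SMF objective in the paper)
    Z = np.zeros_like(M)
    for _ in range(iters):
        grad = 2.0 * (Z - M)                  # gradient of the quadratic loss
        Z = project_rank(Z - step * grad, r)  # gradient step, then projection
    return Z

rng = np.random.default_rng(0)
# Ground truth: an exactly rank-2 matrix
M = rng.standard_normal((20, 2)) @ rng.standard_normal((2, 15))
Z = projected_gd(M, r=2)
err = np.linalg.norm(Z - M)
```

For this toy quadratic the iterates reach the global minimizer regardless of initialization, mirroring in spirit (though not in substance) the paper's guarantee that the lifted SMF problem is solved globally and exponentially fast.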
Supplementary Material: pdf
Submission Number: 14194