Recovering Simultaneously Structured Data via Non-Convex Iteratively Reweighted Least Squares

Published: 21 Sept 2023, Last Modified: 02 Nov 2023 · NeurIPS 2023 poster
Keywords: low-rank models, sparsity, iteratively reweighted least squares, non-convex optimization, quadratic convergence, simultaneously structured data
TL;DR: We propose a novel iteratively reweighted least squares method for the recovery of simultaneously structured data from measurements, and show its local quadratic convergence in a minimal sample complexity regime.
Abstract: We propose a new algorithm for the problem of recovering data that adheres to multiple, heterogeneous low-dimensional structures from linear observations. Focusing on data matrices that are simultaneously row-sparse and low-rank, we propose and analyze an iteratively reweighted least squares (IRLS) algorithm that is able to leverage both structures. In particular, it optimizes a combination of non-convex surrogates for row-sparsity and rank, with a balancing between the two built into the algorithm. We prove locally quadratic convergence of the iterates to a simultaneously structured data matrix in a regime of minimal sample complexity (up to constants and a logarithmic factor), which is known to be impossible for a combination of convex surrogates. In experiments, we show that the IRLS method exhibits favorable empirical convergence, identifying simultaneously row-sparse and low-rank matrices from fewer measurements than state-of-the-art methods.
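To make the abstract's description concrete, the sketch below illustrates the general shape of an IRLS iteration for a matrix that is simultaneously row-sparse and low-rank: weights derived from smoothed row norms (sparsity surrogate) and smoothed singular values (rank surrogate) define a quadratic objective, which is minimized subject to the linear measurement constraint. The function name, the additive combination of the two weight matrices, and the smoothing-parameter schedule are illustrative assumptions, not the exact weight operator or balancing analyzed in the paper.

```python
# Illustrative IRLS sketch for recovering an n1 x n2 matrix X that is
# simultaneously row-sparse (k nonzero rows) and low-rank (rank r) from
# linear measurements y = A @ vec(X). NOT the paper's exact algorithm:
# the weight combination and epsilon schedule are simplified assumptions.
import numpy as np

def irls_sparse_lowrank(A, y, n1, n2, k, r, n_iter=50, eps_floor=1e-10):
    """A: (m, n1*n2) acting on column-stacked vec(X); y: (m,) measurements."""
    # Initialize with the minimum-norm solution of A vec(X) = y.
    x = A.T @ np.linalg.solve(A @ A.T, y)
    X = x.reshape(n1, n2, order="F")
    eps_sp = eps_lr = 1.0
    for _ in range(n_iter):
        # Row-sparsity weights: rows with small norm get large penalties.
        row_norms = np.linalg.norm(X, axis=1)
        D = np.diag(1.0 / np.sqrt(row_norms**2 + eps_sp**2))
        # Low-rank weights: inverse square root of the smoothed Gram matrix.
        evals, evecs = np.linalg.eigh(X @ X.T + eps_lr**2 * np.eye(n1))
        W = evecs @ np.diag(1.0 / np.sqrt(evals)) @ evecs.T
        # Weighted least squares step: minimize sum_j X[:,j]^T (D + W) X[:,j]
        # subject to A vec(X) = y (closed form via Lagrange multipliers).
        H_inv = np.kron(np.eye(n2), np.linalg.inv(D + W))
        x = H_inv @ A.T @ np.linalg.solve(A @ H_inv @ A.T, y)
        X = x.reshape(n1, n2, order="F")
        # Shrink the smoothing parameters as the iterate localizes the
        # structure (simple schedule; k and r are assumed known here).
        srt_rows = np.sort(np.linalg.norm(X, axis=1))[::-1]
        svals = np.linalg.svd(X, compute_uv=False)
        eps_sp = max(min(eps_sp, srt_rows[min(k, n1 - 1)]), eps_floor)
        eps_lr = max(min(eps_lr, svals[min(r, len(svals) - 1)]), eps_floor)
    return X
```

In this simplified form, each iteration costs one dense linear solve of size m, so it is only meant to convey the structure of the reweighting; the paper's method and experiments address how to balance the two surrogates and run at larger scale.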
Supplementary Material: zip
Submission Number: 14765