Query Efficient Structured Matrix Approximation

Published: 28 Jun 2025, Last Modified: 28 Jun 2025 · TASC 2025 · CC BY 4.0
Keywords: structured matrix approximation, matrix-vector products, operator learning
Abstract: We study the problem of learning a structured approximation (low-rank, sparse, banded, etc.) to an unknown matrix $\mathbf{A}$ given access to matrix-vector product queries of the form $\mathbf{x} \rightarrow \mathbf{A}\mathbf{x}$ and $\mathbf{x} \rightarrow \mathbf{A}^\mathsf{T}\mathbf{x}$. Among many other applications at the intersection of machine learning and scientific computing, this problem arises as a natural abstraction of the operator learning problem in Scientific Machine Learning [Boullé, Townsend, FoCM 2023]. We take a step towards understanding the sample complexity of structured matrix approximation by proving matching upper and lower bounds for the number of queries needed to find a near-optimal approximation to $\mathbf{A}$ from any structured family, $\mathcal{F}$, of finite size. In particular, we show that $\Theta(\sqrt{\log|\mathcal{F}|})$ queries always suffice, and are necessary in the worst case. The upper bound improves on the natural baseline of $O(\log|\mathcal{F}|)$ queries. In this workshop submission, we provide a self-contained proof of this result in the simplified realizable setting, demonstrating the key ideas and techniques used in forthcoming work to prove the general version.
Submission Number: 7