Optimal Compressive Covariance Sketching via Rank-One Sampling

Published: 25 Mar 2025, Last Modified: 20 May 2025
SampTA 2025 Poster
License: CC BY 4.0
Session: General
Keywords: Compressive covariance sketching, rank-1 Gaussian measurements, quadratic sampling model, nonconvex penalty, majorization-minimization, sparsity
Abstract: In this paper, we study the problem of compressive covariance sketching, where the goal is to compress a high-dimensional data stream and recover its covariance matrix from a limited number of compressed measurements. This problem is particularly relevant in scenarios where the data evolves rapidly or where sensing devices are constrained by limited computational and storage resources. We consider the rank-one sampling model under the assumption that the underlying covariance matrix is sparse. To estimate the covariance matrix, we propose a regularized least-squares estimator that incorporates nonconvex sparsity-inducing penalties. To compute the estimator efficiently, we develop a multi-stage convex relaxation algorithm based on the majorization-minimization (MM) framework. Each subproblem in the MM scheme is approximately solved via a proximal Newton method, which enjoys a locally quadratic convergence rate. We establish that the proposed estimator achieves the oracle statistical convergence rate after a sufficient number of iterations. Numerical experiments corroborate our theoretical findings and demonstrate the effectiveness of the proposed approach.
Submission Number: 4
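The quadratic sampling model described in the abstract can be sketched in a few lines: each measurement is y_i = a_i^T Σ a_i for a Gaussian vector a_i, and the covariance is recovered by penalized least squares. The sketch below is illustrative only: it uses a plain ℓ1 penalty solved by proximal gradient descent as a convex stand-in for the paper's nonconvex penalties and MM / proximal Newton scheme, and the dimensions, step size, and penalty weight are hypothetical choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
d, m = 20, 600  # ambient dimension, number of rank-one sketches (illustrative)

# Sparse ground-truth covariance: identity plus a few off-diagonal entries
Sigma = np.eye(d)
Sigma[0, 1] = Sigma[1, 0] = 0.5
Sigma[2, 5] = Sigma[5, 2] = 0.3

# Rank-one Gaussian sketches: y_i = a_i^T Sigma a_i (quadratic sampling model)
A = rng.standard_normal((m, d))
y = np.einsum('ij,jk,ik->i', A, Sigma, A)

# Recovery by proximal gradient descent on the least-squares loss with an
# l1 penalty -- a convex stand-in for the nonconvex-penalty MM machinery in
# the paper; step size and lam here are illustrative guesses.
lam, step = 0.05, 0.02
S = np.zeros((d, d))
for _ in range(300):
    r = np.einsum('ij,jk,ik->i', A, S, A) - y     # residuals a_i^T S a_i - y_i
    grad = (A.T * r) @ A / m                      # gradient of 0.5 * mean(r_i^2)
    S = S - step * grad
    S = np.sign(S) * np.maximum(np.abs(S) - step * lam, 0.0)  # soft-threshold
    S = (S + S.T) / 2                             # keep the iterate symmetric

err = np.linalg.norm(S - Sigma) / np.linalg.norm(Sigma)
print(f"relative Frobenius error: {err:.3f}")
```

In this noiseless, overdetermined toy setting (m = 600 sketches for d(d+1)/2 = 210 unknowns) the iterates converge to a close approximation of Σ; the paper's contribution is the regime where m is small relative to the dimension and the nonconvex penalty attains the oracle rate.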