Denise: Deep Robust Principal Component Analysis for Positive Semidefinite Matrices

Published: 26 May 2023, Last Modified: 26 May 2023. Accepted by TMLR.
Abstract: The robust PCA of covariance matrices plays an essential role when isolating key explanatory features. The currently available methods for performing such a low-rank plus sparse decomposition are matrix specific, meaning the algorithm must be re-run from scratch for every new matrix. Since these algorithms are computationally expensive, it is preferable to learn and store a function that performs this decomposition nearly instantaneously when evaluated. We therefore introduce Denise, a deep learning-based algorithm for robust PCA of covariance matrices, or more generally of symmetric positive semidefinite matrices, which learns precisely such a function. Theoretical guarantees for Denise are provided. These include a novel universal approximation theorem adapted to our geometric deep learning problem and convergence to an optimal solution of the learning problem. Our experiments show that Denise matches state-of-the-art performance in terms of decomposition quality, while being approximately $2000\times$ faster than the state-of-the-art method, principal component pursuit (PCP), and $200\times$ faster than the current speed-optimized method, fast PCP.
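The decomposition target described in the abstract can be made concrete with a small sketch (this is not the authors' code; all variable names are illustrative): robust PCA seeks to split an observed PSD matrix $M$ into a low-rank PSD part $L$ and a sparse part $S$, i.e. $M = L + S$. Denise learns a map from $M$ to such a pair, whereas classical solvers like PCP optimize it per matrix. The snippet below only constructs and checks a ground-truth instance of the problem:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10

# Low-rank PSD component: L = U U^T with U of shape (n, 2), so rank(L) <= 2.
U = rng.standard_normal((n, 2))
L = U @ U.T

# Sparse PSD component: a few nonzero diagonal entries.
S = np.zeros((n, n))
S[0, 0] = S[3, 3] = 5.0

# Observed matrix: robust PCA must recover L and S from M alone.
M = L + S

rank_L = np.linalg.matrix_rank(L)        # 2
nnz_S = np.count_nonzero(S)              # 2 nonzero entries
min_eig_M = np.linalg.eigvalsh(M).min()  # M stays PSD
print(rank_L, nnz_S, min_eig_M >= -1e-10)
```

A per-matrix solver such as PCP would recover $(L, S)$ by iterative optimization on each new $M$; Denise instead evaluates a trained network once, which is the source of the reported speedup.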
License: Creative Commons Attribution 4.0 International (CC BY 4.0)
Submission Length: Long submission (more than 12 pages of main content)
Previous TMLR Submission Url:
Changes Since Last Submission: We want to thank the referees and the action editor for their helpful feedback and for reviewing our manuscript in detail. We very much appreciate your hard work and the quality of the reviews, and we believe they really helped improve our submission. Thanks a lot, The authors.
Assigned Action Editor: ~Stephen_Becker1
Submission Number: 833