$\gamma$-Orthogonalized Tensor Deflation: Towards Robust \& Interpretable Tensor Decomposition in the Presence of Correlated Components

24 Sept 2023 (modified: 11 Feb 2024) · Submitted to ICLR 2024
Primary Area: learning theory
Keywords: Low-rank signal reconstruction, tensor decomposition, random matrix theory, optimization.
Abstract: We tackle the problem of recovering a low-rank tensor signal with possibly correlated components from a random noisy tensor, the so-called \textit{spiked tensor model}. When the underlying components are orthogonal, they can be recovered efficiently via \textit{tensor deflation}; correlated components, however, can disrupt the deflation mechanism and prevent efficient recovery. Relying on recently developed tools from random tensor theory, we address the non-orthogonal case by deriving an asymptotic analysis of a \textit{parameterized} deflation procedure, which we refer to as $\gamma$-orthogonalized tensor deflation. Building on this analysis, we propose an efficient tensor deflation algorithm that optimizes the parameter injected into the deflation mechanism, a choice that is optimal by construction for the studied tensor model. We carry out a detailed theoretical and algorithmic analysis of the rank-2 order-3 model and outline a general framework for tackling arbitrary ranks and orders, with the aim of broader impact in machine learning and beyond.
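To make the setting concrete, below is a minimal NumPy sketch of a rank-2 order-3 spiked tensor with correlated components, together with a $\gamma$-parameterized deflation step. This is an illustrative reading of the abstract, not the paper's exact construction: the noise scaling, the power-iteration estimator, the names (`power_iteration`, `gamma_deflation`, `alpha`, `beta`), and the way $\gamma$ enters the residual are all our assumptions.

```python
import numpy as np

def power_iteration(T, n_iter=200, seed=0):
    """Estimate the dominant rank-1 component of a (near-)symmetric
    order-3 tensor T via tensor power iteration."""
    rng = np.random.default_rng(seed)
    u = rng.standard_normal(T.shape[0])
    u /= np.linalg.norm(u)
    for _ in range(n_iter):
        # Contract two modes with u: v_i = sum_{j,k} T[i,j,k] u_j u_k.
        v = np.einsum('ijk,j,k->i', T, u, u)
        u = v / np.linalg.norm(v)
    beta_hat = np.einsum('ijk,i,j,k->', T, u, u, u)
    return beta_hat, u

def gamma_deflation(T, gamma):
    """Two-step deflation: estimate the first component, remove a
    gamma-weighted rank-1 term, then estimate the second component.
    (A hypothetical form of the gamma-parameterized step.)"""
    b1, u1 = power_iteration(T, seed=0)
    residual = T - gamma * b1 * np.einsum('i,j,k->ijk', u1, u1, u1)
    b2, u2 = power_iteration(residual, seed=1)
    return (b1, u1), (b2, u2)

# Rank-2 order-3 spiked tensor with correlated components <x1, x2> = alpha.
n, alpha, beta = 50, 0.6, (8.0, 6.0)
rng = np.random.default_rng(42)
x1 = rng.standard_normal(n)
x1 /= np.linalg.norm(x1)
z = rng.standard_normal(n)
z -= (z @ x1) * x1          # orthogonalize z against x1
z /= np.linalg.norm(z)
x2 = alpha * x1 + np.sqrt(1.0 - alpha**2) * z
W = rng.standard_normal((n, n, n)) / np.sqrt(n)  # assumed noise scaling; not symmetrized, for brevity
T = (beta[0] * np.einsum('i,j,k->ijk', x1, x1, x1)
     + beta[1] * np.einsum('i,j,k->ijk', x2, x2, x2)
     + W)

(_, u1), (_, u2) = gamma_deflation(T, gamma=1.0)
print('alignments:', abs(u1 @ x1), abs(u2 @ x2))
```

In this sketch, `gamma=1.0` corresponds to plain deflation; the abstract's proposal is to optimize the injected parameter rather than fixing it, so as to compensate for the correlation between components.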
Submission Number: 9131