Simple and Nearly-Optimal Sampling for Rank-1 Tensor Completion via Gauss-Jordan

TMLR Paper5086 Authors

11 Jun 2025 (modified: 30 Jul 2025) · Under review for TMLR · CC BY 4.0
Abstract: We revisit the sample and computational complexity of the rank-1 tensor completion problem in $\otimes_{i=1}^{N} \mathbb{R}^{d}$, given a uniformly sampled subset of entries. We present a characterization of the problem that reduces it to solving a pair of random linear systems. For example, when $N$ is a constant, we prove that it requires no more than $m = O(d^2 \log d)$ samples and runtime $O(md^2)$. Moreover, we show that a broad class of algorithms requires $\Omega(d\log d)$ samples, even in higher-rank scenarios. In contrast, existing upper bounds on the sample complexity are at least as large as $d^{1.5} \mu^{\Omega(1)} \log^{\Omega(1)} d$, where $\mu$ can be $\Theta(d)$ in the worst case. Prior works obtained these looser guarantees for higher-rank versions of our problem, and tend to involve more complicated algorithms.
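The abstract's reduction to "a pair of random linear systems" plausibly follows the classical log-trick for rank-1 structure: each observed entry $T_{i_1\dots i_N} = \prod_n x^{(n)}_{i_n}$ yields one linear equation in the log-magnitudes of the factors (with signs handled by a second system over GF(2)). The sketch below is a hypothetical illustration of that idea for $N = 3$ with positive entries, not the paper's actual algorithm; all variable names are our own, and signs are omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)
d, N = 6, 3

# Ground-truth rank-1 tensor with positive entries.  (General signs would
# need a second linear system over GF(2); omitted in this sketch.)
factors = [np.exp(rng.standard_normal(d)) for _ in range(N)]
T = np.einsum('i,j,k->ijk', *factors)

# Uniformly sample m entries without replacement.
m = 200
idx = rng.choice(d**N, size=m, replace=False)
coords = np.stack(np.unravel_index(idx, (d,) * N), axis=1)  # shape (m, N)

# Taking logs turns each sample T[i,j,k] = a[i]*b[j]*c[k] into a linear
# equation  u[i] + v[j] + w[k] = log T[i,j,k]  in the N*d log-factors.
A = np.zeros((m, N * d))
for row, (i, j, k) in enumerate(coords):
    A[row, i] = A[row, d + j] = A[row, 2 * d + k] = 1.0
rhs = np.log(T[coords[:, 0], coords[:, 1], coords[:, 2]])

# The system is consistent, so least squares returns an exact solution,
# picked from the gauge class (factors are only defined up to rescaling).
sol, *_ = np.linalg.lstsq(A, rhs, rcond=None)
u, v, w = sol[:d], sol[d:2 * d], sol[2 * d:]

# Reconstruction exp(u[i] + v[j] + w[k]) is invariant to the gauge choice.
T_hat = np.exp(u[:, None, None] + v[None, :, None] + w[None, None, :])
assert np.allclose(T_hat, T)
```

With enough samples for the observation pattern to be connected, the null space of the system is exactly the rescaling ambiguity, so the completed tensor is recovered exactly; the $O(md^2)$ runtime in the abstract would correspond to solving such systems via Gauss-Jordan elimination.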
Submission Length: Regular submission (no more than 12 pages of main content)
Changes Since Last Submission: We thank the reviewers and the editor for their valuable criticism, suggestions, and promptness. Below you will find responses to each point raised by the reviewers. Nearly all of these points led to changes in the posted revision. Beyond minor fixes (e.g., typos), this affected only expository text, with new or heavily modified portions marked in $\textcolor{blue}{blue}$. For convenience, we summarize the main changes made to the manuscript, as requested by the reviewers:
- We have taken great care to strengthen the emphasis on our contributions, technical challenges, and relationship to prior work. In particular: (1) we highlight the gaps in the prior literature addressed by this work, (2) we provide a much deeper explanation of the mismatch between our upper and lower bounds and substantiate our conjecture, and (3) we expand our conclusion section into a thorough yet concise summary of our main results, contributions, and challenges.
- The "informal results" theorem caused particular confusion since it summarized several results. We have replaced it with a summary in the same location, $\textit{outside}$ the theorem environment.
- We moved the "Notation" paragraph earlier in the text, so that no nonstandard notation appears before it.
- We streamlined footnotes and incorporated them into the main body wherever possible.
- We removed the "Claim" and "Remark" environments to reduce visual clutter. All intermediate results are now lemmas, and remarks appear in the main body.
- We clarified the usage of certain terms, such as "CPD", "Hadamard matrix", and "complexity class BPP".
- We discuss applications and further emphasize existing references for deeper coverage.
- We emphasized the definition of our problem (originally titled "Rank-1 Completion") and renamed it "Rank-1 Tensor Completion" throughout.
- We included a definition of incoherence in the introduction.
Assigned Action Editor: ~Bamdev_Mishra1
Submission Number: 5086