Simple and Nearly-Optimal Sampling for Rank-1 Tensor Completion via Gauss-Jordan

TMLR Paper 5086 Authors

11 Jun 2025 (modified: 21 Jun 2025) · Under review for TMLR · CC BY 4.0
Abstract: We revisit the sample and computational complexity of the rank-1 tensor completion problem in $\otimes_{i=1}^{N} \mathbb{R}^{d}$, given a uniformly sampled subset of entries. We present a characterization of the problem that reduces it to solving a pair of random linear systems. For example, when $N$ is a constant, we prove that no more than $m = O(d^2 \log d)$ samples and $O(md^2)$ runtime are required. Moreover, we show that a broad class of algorithms requires $\Omega(d \log d)$ samples, even in higher-rank scenarios. In contrast, existing upper bounds on the sample complexity are at least as large as $d^{1.5} \mu^{\Omega(1)} \log^{\Omega(1)} d$, where $\mu$ can be $\Theta(d)$ in the worst case. Prior work obtained these looser guarantees for higher-rank versions of our problem and tends to involve more complicated algorithms.
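
To make the "pair of random linear systems" reduction concrete, here is a minimal sketch (not the authors' code) of the standard log-linear view of rank-1 completion: each observed entry satisfies $T_{i_1,\dots,i_N} = \prod_{n} u^{(n)}_{i_n}$, so taking logarithms of magnitudes yields a linear system in the unknowns $\log |u^{(n)}_j|$, while the signs satisfy a second system over GF(2). The sketch assumes strictly positive entries so the sign system can be skipped, and it uses `np.linalg.lstsq` in place of the Gauss-Jordan elimination the paper's title refers to; the function name `complete_rank1` and all parameter names are illustrative, not from the paper.

```python
import numpy as np

def complete_rank1(shape, samples):
    """Recover a rank-1 tensor with positive entries from sampled entries.

    shape:   tuple (d_1, ..., d_N) of mode sizes.
    samples: list of ((i_1, ..., i_N), value) pairs with value > 0.
    Returns a predictor for arbitrary entries; predictions are exact for
    entries whose index pattern lies in the row space of the sample system.
    """
    N = len(shape)
    offsets = np.concatenate([[0], np.cumsum(shape)])[:-1]
    n_vars = int(sum(shape))
    A = np.zeros((len(samples), n_vars))
    b = np.zeros(len(samples))
    for r, (idx, val) in enumerate(samples):
        # Each sample gives one linear equation:
        #   sum_n log u^{(n)}_{i_n} = log(value).
        for n, i in enumerate(idx):
            A[r, offsets[n] + i] = 1.0
        b[r] = np.log(val)
    # Per-mode rescaling (the gauge freedom of rank-1 factorizations) lies in
    # the null space of A and does not affect recoverable entries.
    x, *_ = np.linalg.lstsq(A, b, rcond=None)

    def predict(idx):
        return float(np.exp(sum(x[offsets[n] + i] for n, i in enumerate(idx))))

    return predict

# Toy usage: sample a random positive rank-1 tensor and check recovery.
rng = np.random.default_rng(0)
d, N = 5, 3
factors = [rng.uniform(0.5, 2.0, d) for _ in range(N)]
T = np.einsum('i,j,k->ijk', *factors)
idxs = [tuple(rng.integers(0, d, N)) for _ in range(60)]
predict = complete_rank1((d,) * N, [(ix, T[ix]) for ix in idxs])
print(T[1, 2, 3], predict((1, 2, 3)))  # should agree up to float error
```

Under this reading, the sample-complexity question becomes: how many uniform samples are needed before the linear system determines all entries (up to the gauge), which is why connectivity-style $\Omega(d \log d)$ coupon-collector lower bounds are natural here.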
Submission Length: Regular submission (no more than 12 pages of main content)
Assigned Action Editor: ~Bamdev_Mishra1
Submission Number: 5086