Abstract: Tensors play a crucial role in numerous scientific and engineering fields. This paper addresses the low-multilinear-rank tensor completion problem, a fundamental task in tensor-related applications. By exploiting the manifold structure inherent to the fixed-multilinear-rank tensor set, we introduce a simple yet highly effective preconditioned Riemannian metric and propose the Preconditioned Riemannian Gradient Descent (PRGD) algorithm. Compared to the standard Riemannian Gradient Descent (RGD), PRGD achieves faster convergence while maintaining the same order of per-iteration computational complexity. Theoretically, we establish a recovery guarantee for PRGD under near-optimal sampling complexity. Numerical results highlight the efficiency of PRGD, which outperforms state-of-the-art methods on both synthetic data and real-world video inpainting tasks.
Lay Summary: **Problems:**
* Many real-world tasks, from Netflix recommendations to restoring damaged photos and videos, rely on reconstructing missing data from partial observations, a task called low-multilinear-rank tensor completion.
* While this task can be treated as optimization on a curved surface (the smooth manifold of fixed-multilinear-rank tensors), existing methods use an oversimplified "ruler" (Riemannian metric) that ignores the data's natural geometry.
**Solutions:**
* We design a "smarter ruler" that automatically adapts to the landscape of the optimization problem.
* Using this, we propose the "Preconditioned Riemannian Gradient Descent" (PRGD) algorithm. Think of it like giving a mountain climber a topographic map instead of assuming all slopes are equally steep.
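The intuition behind preconditioning can be seen on a toy problem. The sketch below is *not* the paper's PRGD algorithm (which operates on the Tucker manifold); it is a minimal illustration, on an ill-conditioned quadratic, of how rescaling the gradient with a (here, diagonal) preconditioner adapts the step to each direction's steepness, which is the same principle PRGD applies through its Riemannian metric:

```python
import numpy as np

# Toy objective f(x) = 0.5 * x^T A x with minimizer x* = 0.
# A is ill-conditioned: one direction is 100x steeper than the other.
A = np.diag([100.0, 1.0])

x_gd = np.array([1.0, 1.0])   # plain gradient descent iterate
x_pgd = np.array([1.0, 1.0])  # preconditioned iterate
P_inv = 1.0 / np.diag(A)      # inverse of the diagonal preconditioner

eta = 1.0 / 100.0             # largest stable step size for plain GD
for _ in range(50):
    grad_gd = A @ x_gd
    x_gd = x_gd - eta * grad_gd
    # Preconditioned step: each coordinate gets a near-unit effective step.
    grad_pgd = A @ x_pgd
    x_pgd = x_pgd - 0.9 * (P_inv * grad_pgd)

# The preconditioned iterate reaches the minimizer far faster: plain GD
# crawls along the flat direction, while preconditioning equalizes both.
print(np.linalg.norm(x_gd), np.linalg.norm(x_pgd))
```

Here the preconditioner is known exactly because the toy problem is quadratic; in PRGD the analogous rescaling is built into the Riemannian metric on the low-multilinear-rank manifold.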
**Impact:**
* **Speed:** PRGD converges over 10× faster than standard methods (like RGD) with marginal extra computational cost per step.
* **Theory:** We prove PRGD recovers missing data reliably, even with very few observed entries (near-optimal sampling complexity).
* **Applications:** In video and image restoration tasks, PRGD improves real-world performance while maintaining mathematical guarantees.
Link To Code: https://github.com/Jiushanqing-0418/PRGD-Tucker
Primary Area: Optimization->Non-Convex
Keywords: Tensor completion, Multilinear rank, Riemannian gradient descent, Preconditioning
Submission Number: 9504