Using Noise to Help Reach Global Minima: Turning Matrix Completion into Noisy Matrix Sensing

12 Sept 2025 (modified: 11 Feb 2026) · Submitted to ICLR 2026 · CC BY 4.0
Keywords: non-convex optimization, low-rank matrix optimization, matrix sensing, implicit bias, tensor, over-parametrization
TL;DR: We provide recovery guarantees for matrix completion when observations are scarce by connecting it to noisy matrix sensing
Abstract: Matrix completion (MC) is an important yet challenging non-convex problem. In realistic settings, exact recovery of the target matrix $M^*$ typically requires strong incoherence and an impractically large number of observed entries. Instead of enforcing exact recovery, we inject a noise perturbation to construct a closely related surrogate that turns MC into a noisy matrix sensing problem with a more benign optimization landscape. Although this surrogate permits a slight, controllable loss in accuracy, it can be solved effectively via over-parameterization (increasing the model size), echoing modern machine learning practice, where large models and stochasticity (e.g., SGD, dropout) make hard objectives tractable. Under the assumption that each entry of the matrix is observed independently and uniformly, we establish explicit accuracy–probability trade-offs as functions of the sampling rate $p$ and a user-chosen noise level. Empirically, our approach succeeds in low-observation regimes where classical exact-recovery pipelines are brittle. More broadly, it underscores a general paradigm in which noise perturbations are combined with large models to tackle modern ML tasks; we use MC as a clean benchmark to formalize this perspective and unify noise, over-parameterization, and recoverability within a single framework.
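The abstract does not spell out the surrogate construction, so the following is only a minimal illustrative sketch, not the authors' method: one standard way to cast MC as noisy matrix sensing is to treat the inverse-propensity-scaled observed matrix, plus injected Gaussian noise, as a full (noisy) observation of $M^*$, and then fit an over-parameterized factorization by gradient descent. All sizes, the surrogate form, and the noise level `sigma` below are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Ground-truth rank-r matrix M*; each entry is observed
# independently with probability p (as assumed in the abstract).
n, r, p = 60, 2, 0.5
M_star = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))
mask = rng.random((n, n)) < p

# Hypothetical surrogate: inverse-propensity-scaled observations
# plus injected Gaussian noise give a fully observed "noisy sensing"
# target X with E[X] = M*; sigma plays the user-chosen noise level.
sigma = 0.05
X = (mask / p) * M_star + sigma * rng.standard_normal((n, n))

# Over-parameterized factorization (k > r): plain gradient descent
# on the full-observation least-squares loss ||U V^T - X||_F^2.
k, lr, steps = 4, 2e-3, 3000
U = 0.1 * rng.standard_normal((n, k))
V = 0.1 * rng.standard_normal((n, k))
loss0 = np.linalg.norm(U @ V.T - X) ** 2
for _ in range(steps):
    R = U @ V.T - X                       # residual vs noisy surrogate
    U, V = U - lr * (R @ V), V - lr * (R.T @ U)

# Accuracy is measured against M*, not the surrogate X: a slight,
# controllable approximation error remains, but no exact-recovery
# machinery is needed.
rel_err = np.linalg.norm(U @ V.T - M_star) / np.linalg.norm(M_star)
print(f"relative error vs M*: {rel_err:.3f}")
```

The rescaling by $1/p$ makes the surrogate unbiased for $M^*$, so the deviation on unobserved entries behaves like measurement noise, which is the sense in which MC becomes noisy matrix sensing here.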
Supplementary Material: zip
Primary Area: optimization
Submission Number: 4245