Keywords: Streaming PCA, Low-Precision Quantization, Oja’s Algorithm, Stochastic Quantization, Finite-Sample Convergence, Spectral Gap Assumption, Minimax Lower Bounds
TL;DR: Low-precision quantized variants of Oja's algorithm provably converge for streaming PCA under suitably fine discretization, while drastically reducing memory and compute requirements.
Abstract: In low-precision streaming PCA, the goal is to estimate the top principal component of a data stream while storing and updating iterates at limited numerical precision. We establish an information-theoretic lower bound on the quantization resolution required to achieve a target accuracy for the leading eigenvector. We then study Oja's algorithm for streaming PCA under both linear and nonlinear stochastic quantization; the quantized variants apply unbiased stochastic quantization to the weight vector and the updates. Under mild moment and spectral-gap assumptions on the data distribution, we show that a batched version achieves the lower bound up to logarithmic factors under both schemes. In the nonlinear quantization setting, this yields a nearly _dimension-free_ quantization error. Empirical evaluations on synthetic streams validate our theoretical findings and demonstrate that our low-precision methods closely track the performance of standard Oja's algorithm.
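To make the update concrete, below is a minimal sketch of one batched Oja step with unbiased stochastic quantization, assuming a uniform b-bit grid over a fixed range [-R, R]. The helper names (`stochastic_quantize`, `quantized_oja_step`), the grid choice, and the batching are illustrative assumptions, not the paper's exact linear or nonlinear schemes.

```python
# Minimal sketch: one batched Oja update with unbiased stochastic quantization.
# Assumes a uniform b-bit grid over [-R, R]; the paper's actual quantizers,
# step-size schedule, and batch sizes may differ.
import numpy as np

def stochastic_quantize(v, bits=8, R=1.0, rng=None):
    """Unbiased stochastic rounding of v onto a uniform grid in [-R, R]."""
    rng = np.random.default_rng() if rng is None else rng
    levels = 2 ** bits - 1
    delta = 2 * R / levels                       # grid spacing
    x = np.clip(v, -R, R)
    lower = np.floor((x + R) / delta) * delta - R
    p_up = (x - lower) / delta                   # P(round up), so E[q] = x
    q = lower + delta * (rng.random(x.shape) < p_up)
    return np.clip(q, -R, R)

def quantized_oja_step(w, x_batch, eta, bits=8, R=1.0, rng=None):
    """One batched Oja step: quantize the update direction and the iterate."""
    g = x_batch.T @ (x_batch @ w) / len(x_batch)     # empirical covariance times w
    g_q = stochastic_quantize(g, bits, R, rng)       # quantize the update
    w_new = w + eta * g_q
    w_new /= np.linalg.norm(w_new)                   # project back to the unit sphere
    return stochastic_quantize(w_new, bits, R, rng)  # store iterate at low precision
```

The design point the sketch illustrates: stochastic rounding keeps the quantizer unbiased (E[q] = x on the clipped range), so the quantization error enters the recursion as zero-mean noise rather than a systematic drift, which is the property that finite-sample analyses of quantized Oja updates typically exploit.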
Supplementary Material: zip
Primary Area: Theory (e.g., control theory, learning theory, algorithmic game theory)
Submission Number: 23091