Space lower bounds for linear prediction

Published: 01 Jan 2019, Last Modified: 07 Oct 2024 · CoRR 2019 · License: CC BY-SA 4.0
Abstract: We show that, in a natural streaming setting, fundamental learning tasks such as finding an approximate linear separator or solving linear regression require memory at least \emph{quadratic} in the dimension. This implies that such problems cannot be solved (at least in this setting) by scalable, memory-efficient streaming algorithms. Our results build on a memory lower bound for a simple linear-algebraic problem -- finding orthogonal vectors -- and utilize estimates on packings of the Grassmannian, the manifold of all linear subspaces of a fixed dimension.
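To make the memory-constrained streaming setting concrete, below is a minimal Python sketch of a one-pass learner for linear separation that keeps only O(d) state (a plain perceptron). The abstract does not specify the paper's exact streaming model, so the model, function names, and data here are illustrative assumptions; the lower bound says that sub-quadratic-memory algorithms of this general kind cannot always find an approximate separator, and this sketch is not the paper's construction.

```python
import numpy as np

def streaming_perceptron(stream, d):
    """One pass over the stream, keeping only a single d-dimensional
    weight vector -- O(d) memory, below the paper's quadratic bound."""
    w = np.zeros(d)                   # the entire retained state
    for x, y in stream:               # each example is seen once, then discarded
        if y * np.dot(w, x) <= 0:     # mistake on (x, y): update the separator
            w += y * x
    return w

# Hypothetical usage: a linearly separable stream in d = 5 dimensions.
rng = np.random.default_rng(0)
d = 5
w_star = rng.standard_normal(d)       # hidden ground-truth separator
examples = [(x, np.sign(np.dot(w_star, x)) or 1.0)   # label 0 mapped to +1
            for x in rng.standard_normal((200, d))]
w_hat = streaming_perceptron(iter(examples), d)
print("agreement with ground truth:",
      np.mean([np.sign(np.dot(w_hat, x)) == y for x, y in examples]))
```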