Simple, unified analysis of Johnson-Lindenstrauss with applications

Published: 16 Jun 2024, Last Modified: 19 Jul 2024, HiLD at ICML 2024 Poster, License: CC BY 4.0
Keywords: Dimensionality reduction, Johnson-Lindenstrauss, Hanson-Wright, Matrix factorization, Uncertainty estimation, Epistemic Neural Networks (ENN), Hypermodel
TL;DR: We simplify and unify the analysis of the Johnson-Lindenstrauss lemma, covering the spherical, binary-coin, sparse JL, Gaussian, and sub-Gaussian constructions. Our approach gives the first rigorous proof of the spherical construction's effectiveness and introduces new sub-Gaussian constructions.
Abstract: We present a simplified and unified analysis of the Johnson-Lindenstrauss (JL) lemma, a cornerstone of dimensionality reduction for managing high-dimensional data. Our approach unifies various constructions under the JL framework, including the spherical, binary-coin, sparse JL, Gaussian, and sub-Gaussian models, all of which preserve the intrinsic geometry of the data, a property essential for applications from streaming algorithms to reinforcement learning. We provide the first rigorous proof of the spherical construction's effectiveness and introduce a general class of sub-Gaussian constructions within this simplified framework. Central to our contribution is an extension of the Hanson-Wright inequality to high dimensions, complete with explicit constants. By using simple yet powerful probabilistic tools and analytical techniques, such as an enhanced diagonalization process, our analysis solidifies the theoretical foundation of the JL lemma by removing an independence assumption and extends its practical applicability to contemporary algorithms.
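For context, the JL lemma states that any n points in R^d can be mapped into R^k with k = O(eps^-2 log n) while distorting every pairwise distance by at most a (1 ± eps) factor. The following is a minimal NumPy sketch (ours, not the paper's code) of the standard random-projection constructions the abstract names: spherical (projection onto a uniformly random k-dimensional subspace), Gaussian, binary-coin, and Achlioptas-style sparse JL. The function name jl_project and the scalings shown are illustrative assumptions, not the paper's unified analysis.

```python
import numpy as np

def jl_project(X, k, construction="gaussian", seed=0):
    """Map the rows of X from R^d down to R^k with a random JL matrix.

    Illustrative sketch of classical JL constructions; each entry/column
    scaling is chosen so squared lengths are preserved in expectation.
    """
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    if construction == "gaussian":
        # i.i.d. Gaussian entries with variance 1/k.
        A = rng.standard_normal((d, k)) / np.sqrt(k)
    elif construction == "binary":
        # Binary-coin (Rademacher) entries +-1/sqrt(k).
        A = rng.choice([-1.0, 1.0], size=(d, k)) / np.sqrt(k)
    elif construction == "sparse":
        # Achlioptas-style sparse JL: entries -1, 0, +1 with probabilities
        # 1/6, 2/3, 1/6, rescaled so each entry again has variance 1/k.
        A = np.sqrt(3.0 / k) * rng.choice(
            [-1.0, 0.0, 1.0], size=(d, k), p=[1 / 6, 2 / 3, 1 / 6]
        )
    elif construction == "spherical":
        # Orthogonal projection onto a random k-dimensional subspace
        # (the column space of a Gaussian matrix is uniformly distributed),
        # scaled by sqrt(d/k) to make it norm-preserving in expectation.
        Q, _ = np.linalg.qr(rng.standard_normal((d, k)))
        A = np.sqrt(d / k) * Q
    else:
        raise ValueError(f"unknown construction: {construction}")
    return X @ A

# Sanity check: pairwise distances survive up to small distortion.
rng = np.random.default_rng(1)
X = rng.standard_normal((20, 5000))            # 20 points in R^5000
Y = jl_project(X, k=800, construction="sparse")
orig = np.linalg.norm(X[0] - X[1])
proj = np.linalg.norm(Y[0] - Y[1])
print(f"distance ratio: {proj / orig:.3f}")    # typically close to 1.0
```

Note that in the sparse construction two-thirds of the entries are zero, so the projection needs only about a third of the multiplications of the dense variants, which is the practical appeal of sparse JL.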
Student Paper: Yes
Submission Number: 14