Robust Tangent Space Estimation via Laplacian Eigenvector Gradient Orthogonalization

Published: 23 Sept 2025, Last Modified: 21 Oct 2025 · NPGML Oral · CC BY 4.0
Keywords: Tangent space estimation, graph Laplacian, tubular neighborhood, manifold learning, dimensionality estimation
Abstract: Estimating the tangent spaces of a data manifold is a fundamental problem in data analysis. The standard approach, Local Principal Component Analysis (LPCA), struggles in high-noise settings due to a critical trade-off in choosing the neighborhood size. Selecting an optimal size requires prior knowledge of the geometric and noise characteristics of the data, which are often unavailable. In this paper, we propose a spectral method, Laplacian Eigenvector Gradient Orthogonalization (LEGO), that utilizes the global structure of the data to guide local tangent space estimation. Instead of relying solely on local neighborhoods, LEGO estimates the tangent space at each data point by orthogonalizing the gradients of low-frequency eigenvectors of the graph Laplacian. We provide theoretical motivation for LEGO with a differential geometric analysis on a tubular neighborhood of a manifold. We show that the gradients of low-frequency Laplacian eigenfunctions align closely with the tangent bundle, while eigenfunctions with high gradients in directions orthogonal to the manifold lie deeper in the spectrum. We demonstrate that LEGO yields tangent space estimates that are significantly more robust than those from LPCA, resulting in marked improvements in downstream tasks such as manifold learning, boundary detection, and local intrinsic dimension estimation.
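The following is a minimal, illustrative sketch of the idea described in the abstract, not the authors' reference implementation. It assumes data X of shape (n, D) sampled near a d-dimensional manifold; the function name lego_tangent_spaces and the parameters k (neighborhood size), K (number of low-frequency eigenvectors), and d (intrinsic dimension) are illustrative choices not specified by the abstract. Eigenvector gradients are estimated here by a local least-squares fit, and the tangent basis is obtained by orthogonalizing the stacked gradients via an SVD.

```python
# Hedged sketch of the LEGO idea: estimate gradients of low-frequency graph
# Laplacian eigenvectors at each point and orthogonalize them to obtain a
# tangent-space basis. Hyperparameters and helper choices are assumptions.
import numpy as np
from scipy.sparse import csgraph
from scipy.sparse.linalg import eigsh
from sklearn.neighbors import kneighbors_graph


def lego_tangent_spaces(X, d, k=15, K=20):
    n, D = X.shape

    # Symmetric kNN affinity graph and its (unnormalized) graph Laplacian.
    W = kneighbors_graph(X, n_neighbors=k, mode="connectivity")
    W = 0.5 * (W + W.T)
    L = csgraph.laplacian(W, normed=False)

    # Low-frequency eigenvectors (smallest eigenvalues) via shift-invert;
    # sort by eigenvalue and drop the constant eigenvector.
    vals, vecs = eigsh(L, k=K + 1, sigma=-1e-6, which="LM")
    order = np.argsort(vals)
    Phi = vecs[:, order][:, 1:K + 1]            # (n, K)

    # Neighbor index lists for local least-squares gradient estimates.
    nbrs = W.tolil().rows

    tangent_bases = np.zeros((n, D, d))
    for i in range(n):
        idx = np.asarray(nbrs[i])
        A = X[idx] - X[i]                       # (k_i, D) local displacements
        B = Phi[idx] - Phi[i]                   # (k_i, K) eigenvector differences
        # Gradient of each eigenvector at x_i: least-squares solve A @ G ~ B.
        G, *_ = np.linalg.lstsq(A, B, rcond=None)   # (D, K), columns = gradients
        # Orthogonalize the gradients: the top-d left singular vectors span
        # the estimated tangent space at x_i.
        U, _, _ = np.linalg.svd(G, full_matrices=False)
        tangent_bases[i] = U[:, :d]
    return tangent_bases
```

A downstream task could then use the returned (n, D, d) bases directly, e.g., projecting local displacements onto tangent_bases[i] for manifold learning or comparing singular-value decay to estimate local intrinsic dimension.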
Submission Number: 103