Maximum Variance Unfolding on Disjoint Manifolds

ICLR 2026 Conference Submission 19982 Authors

19 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: nonlinear dimensionality reduction; maximum variance unfolding; neighborhood graph; disjoint manifolds; manifold learning
TL;DR: We extend maximum variance unfolding to the case of disconnected neighborhood graphs
Abstract: An assumption underlying much of machine learning is that observed data are sampled from a manifold of much lower dimension than the data space itself. While linear methods such as PCA can often be used to perform dimensionality reduction, they fail to capture the nonlinear relationships frequently present in natural datasets. Maximum variance unfolding is an established and well-studied neighborhood graph-based method for nonlinear dimensionality reduction, with the unique property of retaining exact local isometry. However, its applicability to real-world data is limited by its dependence on the connectivity of the underlying neighborhood graph: in natural datasets, data are often multimodal and lie on disjoint manifolds, giving rise to clusters of points that are distant in the data space. In this work, we present a method that extends maximum variance unfolding to the common case where data lie on disjoint manifolds. We show that it decreases both computation time and memory requirements, and that it improves performance on standard metrics that assess how well the local structure of the data is preserved.
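The connectivity issue the abstract describes is easy to observe directly: when data form well-separated clusters, a k-nearest-neighbor graph splits into one component per cluster, and classical maximum variance unfolding (which assumes a single connected component) cannot be applied as-is. The following is a minimal sketch of detecting this situation with SciPy; the synthetic two-cluster data and the choice k = 5 are illustrative assumptions, not the paper's setup.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import connected_components

rng = np.random.default_rng(0)
# Two well-separated Gaussian clusters stand in for samples
# from two disjoint manifolds (illustrative data, not the paper's).
X = np.vstack([
    rng.normal(0.0, 0.1, size=(100, 3)),
    rng.normal(5.0, 0.1, size=(100, 3)),
])

# Build the symmetrized k-nearest-neighbor graph.
k = 5
tree = cKDTree(X)
_, idx = tree.query(X, k=k + 1)  # first neighbor of each point is itself

rows = np.repeat(np.arange(len(X)), k)
cols = idx[:, 1:].ravel()
graph = csr_matrix(
    (np.ones(len(rows)), (rows, cols)), shape=(len(X), len(X))
)

# A disconnected graph is what breaks classical MVU: the SDP's
# local-distance constraints say nothing about inter-component placement.
n_components, labels = connected_components(graph, directed=False)
print(n_components)  # more than one component: MVU's connectivity assumption fails
```

Because no point's nearest neighbors cross the large gap between the clusters, the two clusters necessarily end up in different components, which is exactly the regime the paper's extension targets.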
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Submission Number: 19982