Out-of-Sample Extension of Spectral Embeddings: An Optimization Perspective

Published: 16 Nov 2024 · Last Modified: 26 Nov 2024 · LoG 2024 Poster · CC BY 4.0
Keywords: Manifold learning, Riemannian optimization
TL;DR: We propose a new framework for out-of-sample extension of spectral graph-based embedding algorithms, motivated by viewing the underlying eigenvector problem as a quadratic program on the Stiefel manifold.
Abstract: Graph-based manifold learning constructs low-dimensional embeddings of high-dimensional data; however, it requires out-of-sample extension methods to embed new data points. We propose a new framework, ROSE (Riemannian Out-of-Sample Extension), for out-of-sample extension of spectral graph-based embedding algorithms. ROSE is motivated by an optimization perspective on the eigenvector problem underlying classic manifold learning methods. Similar to graph-based semi-supervised learning, our approach exploits the geometry of the unlabeled samples in addition to the labeled samples, by treating the in-sample embeddings as \emph{labeled} data. Despite its nonconvexity, ROSE is solvable by first-order methods, which converge to global minimizers under certain assumptions. We present numerical experiments on a variety of real-world and synthetic benchmarks. Empirically, we show on standard image datasets (MNIST, FMNIST, CIFAR-10) that ROSE closely approximates the ground-truth eigenvectors, in terms of neighborhood preservation, in both low and high label-rate regimes.
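As a hedged sketch of the optimization viewpoint described in the abstract (the notation below, a graph Laplacian L, embedding dimension d, and an in-/out-of-sample row split, is ours and not taken from the paper), the spectral embedding step can be posed as a quadratic program on the Stiefel manifold, up to the normalization conventions of the specific spectral method:

\min_{Y \in \mathbb{R}^{n \times d}} \ \operatorname{tr}\left(Y^\top L Y\right) \quad \text{subject to} \quad Y^\top Y = I_d.

An out-of-sample extension in this spirit would fix the rows of Y corresponding to in-sample points at their known embeddings (the "labeled" data) and apply a first-order Riemannian method to the remaining rows, which is how we read the framework summarized above; the precise formulation is given in the paper itself.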
Submission Type: Extended abstract (max 4 main pages).
Submission Number: 155