Hyperboloid GPLVM for Discovering Continuous Hierarchies via Nonparametric Estimation
Abstract: Dimensionality reduction (DR) offers interpretable representations of complex high-dimensional data, and recent DR methods have leveraged hyperbolic geometry to obtain faithful low-dimensional embeddings of high-dimensional hierarchical relationships. However, existing methods rely on neighbor embedding, which often breaks the continuous nature of the hierarchical structures. This paper proposes hyperboloid Gaussian process latent variable models (hGP-LVMs) to embed high-dimensional hierarchical data while preserving their implicit continuity via nonparametric estimation. We adopt generative modeling with the GP, which yields effective hierarchical embeddings and addresses the otherwise ill-posed hyperparameter tuning. This paper presents three variants of the proposed model, employing point, sparse point, and Bayesian estimation, respectively, and we establish their learning algorithms by incorporating Riemannian optimization and the active approximation scheme of the GP-LVM. In addition, we employ the reparameterization trick for scalable learning of the latent variables in the Bayesian estimation method. The proposed hGP-LVMs were applied to several datasets, and the results demonstrate their ability to represent high-dimensional hierarchies in low-dimensional spaces.
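To make the geometric ingredients mentioned above concrete, the sketch below illustrates, under simplifying assumptions, two building blocks the abstract alludes to: the Lorentz (hyperboloid) model with its Minkowski inner product and exponential map, and a reparameterized sample of a latent variable obtained by drawing a tangent-space Gaussian at the hyperboloid origin and mapping it onto the manifold. This is not the authors' implementation; all function names, shapes, and the choice of basing the sample at the origin are illustrative assumptions.

```python
# Minimal sketch (assumptions, not the paper's code) of Lorentz-model
# operations: Minkowski inner product, exponential map at the origin,
# and a reparameterized hyperboloid sample from a tangent-space Gaussian.
import numpy as np

def minkowski_inner(x, y):
    """<x, y>_L = -x_0 y_0 + sum_i x_i y_i for vectors in R^{d+1}."""
    return -x[..., 0] * y[..., 0] + np.sum(x[..., 1:] * y[..., 1:], axis=-1)

def exp_map_origin(v_spatial):
    """Exponential map at the origin mu0 = (1, 0, ..., 0) of H^d.

    v_spatial: (d,) tangent vector given by its spatial coordinates
    (its time component at the origin is zero). Returns a point on H^d.
    """
    norm = np.linalg.norm(v_spatial)
    if norm < 1e-12:
        return np.concatenate(([1.0], np.zeros_like(v_spatial)))
    time = np.cosh(norm)
    space = np.sinh(norm) / norm * v_spatial
    return np.concatenate(([time], space))

def reparameterized_sample(mean_spatial, log_std, rng):
    """Reparameterization trick on the hyperboloid (simplified: the base
    point is the origin). Draw eps ~ N(0, I), scale and shift it in the
    tangent space, then project onto H^d with the exponential map, so the
    sample is differentiable w.r.t. mean_spatial and log_std."""
    eps = rng.standard_normal(mean_spatial.shape)
    v = mean_spatial + np.exp(log_std) * eps
    return exp_map_origin(v)

rng = np.random.default_rng(0)
x = reparameterized_sample(np.zeros(2), np.log(0.1) * np.ones(2), rng)
print(x, minkowski_inner(x, x))  # <x, x>_L is -1 up to rounding, so x lies on H^2
```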
Submission Number: 608