A Discussion On the Validity of Manifold Learning

29 Sept 2021 (modified: 13 Feb 2023) · ICLR 2022 Conference Withdrawn Submission
Keywords: Manifold learning, Dimensionality Reduction, Computational Geometry, Simplicial Complex
Abstract: Dimensionality reduction (DR) and manifold learning (ManL) have been applied extensively in many machine learning tasks, including signal processing, speech recognition, and neuroinformatics. However, whether DR and ManL models can generate valid learning results remains poorly understood. In this work, we investigate the validity of the learning results of several widely used DR and ManL methods through the chart mapping function of a manifold. We identify a fundamental problem with these methods: the mapping functions they induce violate the basic settings of manifolds, and hence they do not learn a manifold in the mathematical sense. To address this problem, we provide a provably correct algorithm, fixed points Laplacian mapping (FPLM), which is geometrically guaranteed to find a valid manifold representation (up to homeomorphism). Together with one additional condition (orientation preservation), we discuss a sufficient condition for an algorithm to be bijective on any $d$-simplex decomposition of a $d$-manifold. However, constructing a mapping function and a computational method that satisfy these conditions remains an open problem in mathematics.
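For context, the sketch below illustrates the classical Laplacian-eigenmaps-style mapping that Laplacian-based ManL methods build on; it is not the authors' FPLM algorithm, and all names and parameters (e.g. `laplacian_eigenmaps`, `n_neighbors`) are illustrative assumptions rather than anything specified in the submission.

```python
# Minimal sketch of a Laplacian-eigenmaps-style embedding (NOT FPLM).
import numpy as np
from scipy.spatial.distance import cdist
from scipy.linalg import eigh

def laplacian_eigenmaps(X, n_components=2, n_neighbors=10):
    """Embed points X of shape (n_samples, n_features) via the graph Laplacian."""
    n = X.shape[0]
    D = cdist(X, X)                                  # pairwise Euclidean distances
    idx = np.argsort(D, axis=1)[:, 1:n_neighbors + 1]  # k nearest neighbors (skip self)
    W = np.zeros((n, n))
    rows = np.repeat(np.arange(n), n_neighbors)
    W[rows, idx.ravel()] = 1.0
    W = np.maximum(W, W.T)                           # symmetrize the adjacency graph
    deg = W.sum(axis=1)
    L = np.diag(deg) - W                             # unnormalized graph Laplacian
    # Generalized eigenproblem L y = lambda * Deg * y; drop the trivial constant eigenvector.
    vals, vecs = eigh(L, np.diag(deg))
    return vecs[:, 1:n_components + 1]

# Example: embed a noisy circle (a 1-manifold sampled in R^2).
theta = np.linspace(0, 2 * np.pi, 200, endpoint=False)
X = np.c_[np.cos(theta), np.sin(theta)] + 0.01 * np.random.randn(200, 2)
Y = laplacian_eigenmaps(X)
print(Y.shape)  # (200, 2)
```

As the abstract argues, such induced mappings are generally not guaranteed to be homeomorphisms onto their image, which is the gap the paper's analysis addresses.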