Universal Joint Approximation of Manifolds and Densities by Simple Injective Flows

Published: 28 Jan 2022, Last Modified: 13 Feb 2023, ICLR 2022 Submission
Keywords: Universality, Flow Networks, Manifold Learning, Density Estimation
Abstract: We analyze neural networks composed of bijective flows and injective expansive elements. We find that such networks universally approximate a large class of manifolds simultaneously with densities supported on them. Among others, our results apply to the well-known coupling and autoregressive flows. We build on the work of Teshima et al. (2020) on bijective flows and study injective architectures proposed in Brehmer et al. (2020) and Kothari et al. (2021). Our results leverage a new theoretical device called the \emph{embedding gap}, which measures how far one continuous manifold is from embedding another. We relate the embedding gap to a relaxation of universality we call the \emph{manifold embedding property}, capturing the geometric part of universality. Our proof also establishes that optimality of a network can be established ``in reverse,'' resolving a conjecture made in Brehmer et al. (2020) and opening the door for simple layer-wise training schemes. Finally, we show that the studied networks admit an exact layer-wise projection result, Bayesian uncertainty quantification, and black-box recovery of network weights.
One-sentence Summary: We analyze neural networks composed of bijective flows and injective expansive elements and find that such networks universally approximate a large class of manifolds and densities thereon.