Lipschitz regularized gradient flows and latent generative particles

Published: 01 Jan 2022 · Last Modified: 23 May 2024 · CoRR 2022 · CC BY-SA 4.0
Abstract: We build a new class of generative algorithms capable of efficiently learning an arbitrary target distribution from possibly scarce, high-dimensional data and subsequently generating new samples. These algorithms are particle-based and are constructed as gradient flows of the Lipschitz-regularized Kullback-Leibler divergence, or other Lipschitz-regularized $f$-divergences, under which samples from a source distribution are stably transported, as particles, toward the vicinity of the target distribution. As a highlighted result in data integration, we demonstrate that the proposed algorithms correctly transport gene expression data points of dimension exceeding 54K, even though the sample size is typically only in the hundreds.
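To illustrate the idea behind such particle-based flows, below is a minimal sketch (not the authors' code) of a Lipschitz-regularized KL gradient flow on a 2-D toy problem, assuming a PyTorch implementation. It alternates between fitting a discriminator to the Legendre variational form of the KL divergence, with the Lipschitz constraint enforced softly via a gradient penalty (one common approximation), and moving the particles one explicit Euler step along the resulting potential. All names and hyperparameters (phi, L, dt, the penalty weight) are illustrative assumptions.

```python
# Sketch of a Lipschitz-regularized KL particle flow; illustrative only.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy 2-D data: source particles (standard normal) and target samples.
target = torch.randn(500, 2) * 0.5 + torch.tensor([3.0, 3.0])
particles = torch.randn(500, 2)

# Discriminator phi; its Lipschitz bound is enforced softly below.
phi = nn.Sequential(nn.Linear(2, 64), nn.Tanh(), nn.Linear(64, 1))
opt = torch.optim.Adam(phi.parameters(), lr=1e-3)

L = 1.0   # assumed Lipschitz bound
dt = 0.1  # assumed particle step size

for outer in range(200):
    # Inner loop: fit phi to the Legendre variational form of KL,
    #   KL_L(P || Q) ≈ sup_{phi L-Lipschitz} E_P[phi] - E_Q[exp(phi - 1)].
    for _ in range(5):
        opt.zero_grad()
        obj = phi(target).mean() - torch.exp(phi(particles.detach()) - 1).mean()
        # Gradient penalty pushes |grad phi| <= L on interpolated points
        # (a soft surrogate for the hard Lipschitz constraint).
        eps = torch.rand(500, 1)
        x = (eps * target + (1 - eps) * particles.detach()).requires_grad_(True)
        g = torch.autograd.grad(phi(x).sum(), x, create_graph=True)[0]
        penalty = ((g.norm(dim=1) - L).clamp(min=0) ** 2).mean()
        (-obj + 10.0 * penalty).backward()
        opt.step()

    # Transport step: one Euler step of the flow; particles ascend phi,
    # which is large where the target density dominates the particles'.
    particles.requires_grad_(True)
    grad = torch.autograd.grad(phi(particles).sum(), particles)[0]
    particles = (particles + dt * grad).detach()
```

In this sketch the particles drift toward the target cluster at (3, 3); the Lipschitz bound keeps the estimated potential, and hence the particle velocities, finite even when the two distributions have little overlap, which is the regime the abstract emphasizes for scarce, high-dimensional data.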