Conditional Sampling of Variational Autoencoders via Iterated Approximate Ancestral Sampling

Published: 08 Nov 2023, Last Modified: 08 Nov 2023. Accepted by TMLR.
Abstract: Conditional sampling of variational autoencoders (VAEs) is needed in various applications, such as missing data imputation, but is computationally intractable. A principled choice for asymptotically exact conditional sampling is Metropolis-within-Gibbs (MWG). However, we observe that the tendency of VAEs to learn a structured latent space, a commonly desired property, can cause the MWG sampler to get “stuck” far from the target distribution. This paper mitigates the limitations of MWG: we systematically outline the pitfalls in the context of VAEs, propose two original methods that address these pitfalls, and demonstrate improved performance of the proposed methods on a set of sampling tasks.
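To make the MWG scheme referenced in the abstract concrete, below is a minimal sketch of Metropolis-within-Gibbs conditional sampling (imputation) for a VAE. It assumes a Gaussian encoder q_φ(z|x), a Gaussian decoder p_θ(x|z), and a standard-normal prior; the tiny `Encoder`/`Decoder` modules, dimensions, and function names are hypothetical stand-ins for illustration, not the paper's models or implementation.

```python
# Sketch: MWG conditional sampling of x_mis | x_obs under a (hypothetical) Gaussian VAE.
import torch
import torch.nn as nn
from torch.distributions import Normal, Independent

D, Z = 8, 2  # data and latent dimensionality (illustrative only)

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Linear(D, 2 * Z)
    def forward(self, x):
        mu, log_sig = self.net(x).chunk(2, dim=-1)
        return Independent(Normal(mu, log_sig.exp()), 1)

class Decoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Linear(Z, 2 * D)
    def forward(self, z):
        mu, log_sig = self.net(z).chunk(2, dim=-1)
        return Independent(Normal(mu, log_sig.exp()), 1)

@torch.no_grad()
def mwg_impute(encoder, decoder, x, mask, num_steps=1000):
    """Alternate a Metropolis step on z (encoder as proposal) with an exact
    decoder step on the missing dims. `mask` is 1 = observed, 0 = missing."""
    prior = Independent(Normal(torch.zeros(Z), torch.ones(Z)), 1)
    x = x.clone()
    x[mask == 0] = 0.0            # crude initialisation of the missing entries
    z = encoder(x).sample()
    for _ in range(num_steps):
        # Metropolis-within-Gibbs update of z with proposal q_phi(z | x_obs, x_mis)
        q = encoder(x)
        z_prop = q.sample()
        log_alpha = (prior.log_prob(z_prop) + decoder(z_prop).log_prob(x) + q.log_prob(z)
                     - prior.log_prob(z) - decoder(z).log_prob(x) - q.log_prob(z_prop))
        if torch.rand(()).log() < log_alpha:
            z = z_prop
        # Exact Gibbs update of the missing dims from the decoder p_theta(x | z)
        x_new = decoder(z).sample()
        x = torch.where(mask.bool(), x, x_new)
    return x

# Usage (random weights, purely illustrative):
enc, dec = Encoder(), Decoder()
x_obs = torch.randn(D)
mask = (torch.rand(D) > 0.5).float()
x_imputed = mwg_impute(enc, dec, x_obs, mask)
```

Dropping the accept/reject test and always keeping `z_prop` recovers the simpler pseudo-Gibbs scheme; the accept/reject step is what makes MWG asymptotically exact, and it is also where the "stuck chain" behaviour described in the abstract manifests as persistently rejected proposals.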
Submission Length: Regular submission (no more than 12 pages of main content)
Code: https://github.com/vsimkus/vae-conditional-sampling
Assigned Action Editor: ~Yingzhen_Li1
License: Creative Commons Attribution 4.0 International (CC BY 4.0)
Submission Number: 1482