Abstract: Deep hierarchical variational autoencoders (VAEs) are powerful latent variable generative models.
In this paper, we introduce a hierarchical VAE with a diffusion-based Variational Mixture of the Posterior Prior (VampPrior).
We apply amortization to scale the VampPrior to models with many stochastic layers.
The proposed approach outperforms the original VampPrior and other deep hierarchical VAEs while using fewer parameters.
We empirically validate our method on standard benchmark datasets (MNIST, OMNIGLOT, CIFAR10) and demonstrate improved training stability and latent space utilization.
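As background for the abstract: the VampPrior defines the prior as a mixture of the variational posterior evaluated at a set of learned pseudo-inputs, p(z) = (1/K) Σ_k q(z | u_k). The sketch below is a minimal, illustrative numpy implementation of that density, not the paper's code; the toy linear "encoder", the weight matrix `W`, and all shapes are assumptions made for the example.

```python
import numpy as np

def encoder(x, W):
    # Toy stand-in for an amortized encoder: a linear map producing
    # the mean and log-variance of a diagonal Gaussian posterior.
    h = x @ W
    mu, log_var = h[..., :2], h[..., 2:]
    return mu, log_var

def log_normal(z, mu, log_var):
    # Log-density of a diagonal Gaussian, summed over latent dimensions.
    return -0.5 * np.sum(
        log_var + np.log(2 * np.pi) + (z - mu) ** 2 / np.exp(log_var),
        axis=-1,
    )

def vampprior_logp(z, pseudo_inputs, W):
    # VampPrior density: p(z) = (1/K) sum_k q(z | u_k),
    # where u_k are the (learned) pseudo-inputs.
    mu, log_var = encoder(pseudo_inputs, W)          # shapes (K, D)
    log_components = log_normal(z[None, :], mu, log_var)  # shape (K,)
    return np.logaddexp.reduce(log_components) - np.log(len(pseudo_inputs))
```

With a single pseudo-input (K = 1), the mixture collapses to one Gaussian component, which gives a quick sanity check of the log-sum-exp aggregation.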
Submission Length: Regular submission (no more than 12 pages of main content)
Code: https://github.com/AKuzina/dvp_vae
Assigned Action Editor: ~Arto_Klami1
Submission Number: 3155