On Kernel based Variational Autoencoders

Published: 03 Feb 2026, Last Modified: 03 Feb 2026 · AISTATS 2026 Poster · CC BY 4.0
TL;DR: This paper proposes a VAE variant with a KDE-approximated posterior.
Abstract: In this paper, we bridge Variational Autoencoders (VAEs) and kernel density estimation (KDE) by approximating the posterior with the expectation of a kernel density estimator and deriving a new lower bound on the empirical log-likelihood. The flexibility of KDE provides a new perspective on controlling the KL-divergence term in the original evidence lower bound (ELBO), which enriches the choice of posterior-prior pairs in VAEs. We show theoretically that, under appropriate conditions, the Epanechnikov kernel gives the tightest upper bound for controlling the KL-divergence, and we develop a kernel-based VAE called the Epanechnikov Variational Autoencoder (EVAE). Implementing EVAE is straightforward, since the Epanechnikov kernel belongs to the "location-scale" family of distributions, where the reparametrization trick applies directly. Compared with the Gaussian kernel, the Epanechnikov kernel has compact support, which should make generated samples less blurry. The flexibility of the new lower bound also allows a two-stage training strategy that treats reconstruction and generation separately, analogous to the idea behind VQ-VAE. Extensive experiments illustrate the potential of EVAE in image generation and its superiority over the vanilla VAE and other baseline models in the quality of reconstructed images, as measured by FID score and sharpness. We also carry out additional experiments on applying EVAE to downstream classification tasks.
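The abstract notes that the Epanechnikov kernel lies in the location-scale family, so the standard reparametrization trick z = μ + σ·ε applies with ε drawn from the standard Epanechnikov density f(x) = 3/4·(1 − x²) on [−1, 1]. A minimal NumPy sketch of that sampling step is below; it is an illustration of the reparametrization, not the authors' implementation (function names are ours), and it uses the classical fact that the median of three iid Uniform(−1, 1) variates follows the Epanechnikov distribution:

```python
import numpy as np

def sample_epanechnikov(size, rng=None):
    """Draw samples from the standard Epanechnikov density
    f(x) = 3/4 * (1 - x^2) on [-1, 1], using the fact that the
    median of three iid Uniform(-1, 1) variates is Epanechnikov."""
    rng = np.random.default_rng(rng)
    u = rng.uniform(-1.0, 1.0, size=(3,) + tuple(size))
    return np.median(u, axis=0)

def reparametrize(mu, sigma, rng=None):
    """Location-scale reparametrization z = mu + sigma * eps with
    eps ~ Epanechnikov, so z has compact support [mu - sigma, mu + sigma]
    and gradients can flow through mu and sigma."""
    mu = np.asarray(mu, dtype=float)
    sigma = np.asarray(sigma, dtype=float)
    eps = sample_epanechnikov(mu.shape, rng=rng)
    return mu + sigma * eps

# Sanity check: samples stay in [-1, 1], mean ~ 0, variance ~ 1/5.
z = sample_epanechnikov((100_000,), rng=0)
```

In a real EVAE encoder, `mu` and `sigma` would be the network's outputs and `eps` would be drawn per latent dimension; the compact support of the noise is what distinguishes this from the Gaussian reparametrization.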
Submission Number: 681