Direct Optimization through $\arg \max$ for Discrete Variational Auto-Encoder

27 Sept 2018 (modified: 05 May 2023) · ICLR 2019 Conference Blind Submission
Abstract: Reparameterization of variational auto-encoders is an effective method for reducing the variance of their gradient estimates. However, when the latent variables are discrete, a reparameterization is problematic due to discontinuities in the discrete space. In this work, we extend the direct loss minimization technique to discrete variational auto-encoders. We first reparameterize a discrete random variable using the $\arg \max$ function of the Gumbel-Max perturbation model. We then use direct optimization to propagate gradients through the non-differentiable $\arg \max$ using two perturbed $\arg \max$ operations.
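The abstract's two ideas can be sketched in code: a categorical sample is reparameterized as the $\arg \max$ of Gumbel-perturbed logits, and a gradient is estimated from the difference between two perturbed $\arg \max$ operations, one with the loss folded into the logits and one without. This is an illustrative sketch of the general direct-optimization idea only, not the paper's implementation; the step size `eps`, the sign convention, and the function names are assumptions.

```python
import numpy as np

def sample_gumbel(shape, rng):
    # Gumbel(0, 1) noise via the inverse CDF: -log(-log(U)), U ~ Uniform(0, 1)
    u = rng.uniform(1e-10, 1.0, size=shape)
    return -np.log(-np.log(u))

def gumbel_max_sample(theta, rng):
    # Gumbel-Max reparameterization: a categorical sample with logits theta
    # equals the argmax of theta plus i.i.d. Gumbel noise.
    gamma = sample_gumbel(theta.shape, rng)
    return int(np.argmax(theta + gamma))

def direct_grad(theta, loss_per_category, eps, rng):
    # Direct-optimization gradient sketch: compare two perturbed argmax
    # operations that share the SAME Gumbel noise, one on the original
    # logits and one on logits shifted by eps * loss.
    # (eps and the sign convention here are illustrative assumptions.)
    gamma = sample_gumbel(theta.shape, rng)
    z = np.argmax(theta + gamma)
    z_eps = np.argmax(theta + eps * loss_per_category + gamma)
    g = np.zeros_like(theta)
    g[z_eps] += 1.0 / eps
    g[z] -= 1.0 / eps
    return g
```

Because the estimate is a difference of two indicator vectors scaled by `1/eps`, its components always sum to zero, and it vanishes whenever the loss perturbation does not flip the winning category.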
Keywords: discrete variational auto encoders, generative models, perturbation models
Code: [GuyLor/direct_vae](https://github.com/GuyLor/direct_vae) + [1 community implementation](https://paperswithcode.com/paper/?openreview=S1ey2sRcYQ)
Data: [CelebA](https://paperswithcode.com/dataset/celeba), [Fashion-MNIST](https://paperswithcode.com/dataset/fashion-mnist)
