Preventing posterior collapse in variational autoencoders for text generation via decoder regularization

27 Sept 2021 (modified: 24 Nov 2021) · DGMs and Applications @ NeurIPS 2021 Poster
Keywords: variational autoencoder, posterior collapse, text generation
TL;DR: We propose a novel regularization method based on fraternal dropout to prevent posterior collapse in VAEs for text generation
Abstract: Variational autoencoders trained to minimize the reconstruction error are prone to the posterior collapse problem, in which the approximate posterior distribution collapses onto the prior and the latent variable is ignored. We propose a novel regularization method based on fraternal dropout to prevent posterior collapse. We evaluate our approach using several metrics and observe improvements in all the tested configurations.
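The abstract only names the ingredients (a VAE for text and a fraternal-dropout-style decoder regularizer), so the following is a minimal illustrative sketch rather than the paper's actual method: a sentence VAE whose training loss adds a consistency penalty between two decoder passes drawn with independent dropout masks. All class and parameter names (`SentenceVAE`, `kappa`, `beta`, the architecture sizes) are assumptions for the example, not taken from the paper.

```python
# Hypothetical sketch: a sentence VAE whose loss combines the usual ELBO with a
# fraternal-dropout-style consistency term between two dropout-perturbed decoder
# passes. Architecture and hyperparameters are illustrative only.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SentenceVAE(nn.Module):
    def __init__(self, vocab_size=10000, embed_dim=256, hidden_dim=512,
                 latent_dim=32, dropout=0.3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.encoder = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        self.to_mu = nn.Linear(hidden_dim, latent_dim)
        self.to_logvar = nn.Linear(hidden_dim, latent_dim)
        self.z_to_h = nn.Linear(latent_dim, hidden_dim)
        self.decoder = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        self.drop = nn.Dropout(dropout)            # stochastic decoder dropout
        self.out = nn.Linear(hidden_dim, vocab_size)

    def encode(self, x):
        _, h = self.encoder(self.embed(x))         # h: (1, batch, hidden)
        h = h.squeeze(0)
        return self.to_mu(h), self.to_logvar(h)

    def decode(self, z, x_in):
        # Each call draws a fresh dropout mask, so two calls give two "sibling" passes.
        h0 = torch.tanh(self.z_to_h(z)).unsqueeze(0)
        out, _ = self.decoder(self.drop(self.embed(x_in)), h0)
        return self.out(self.drop(out))            # logits: (batch, seq, vocab)


def loss_with_fraternal_regularizer(model, x_in, x_tgt, kappa=1.0, beta=1.0):
    """ELBO plus a fraternal-dropout consistency penalty between two decoder passes."""
    mu, logvar = model.encode(x_in)
    z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)   # reparameterization

    logits_a = model.decode(z, x_in)               # pass 1, dropout mask A
    logits_b = model.decode(z, x_in)               # pass 2, dropout mask B

    rec = F.cross_entropy(logits_a.transpose(1, 2), x_tgt)    # reconstruction term
    kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())

    # Fraternal term: penalize disagreement between the two dropout-perturbed
    # predictive distributions, discouraging the decoder from relying solely on
    # the autoregressive context and encouraging it to use z.
    frat = F.mse_loss(F.softmax(logits_a, dim=-1), F.softmax(logits_b, dim=-1))

    return rec + beta * kl + kappa * frat
```

In this sketch the fraternal term is weighted by `kappa`; how the paper weights or schedules its regularizer, and how it measures the disagreement between the two passes, is not specified in the abstract.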