Abstract: Posterior collapse is a phenomenon in which the posterior distribution degenerates to the prior, degrading the quality of both the latent encodings and the generative model. While it is known to occur in Variational Autoencoders (VAEs), it is unknown whether it occurs in Variational Gradient Origin Networks (VGONs). The goal of this paper is to compare posterior collapse in VGONs and VAEs. Evaluating the latent encodings of VGONs against key posterior collapse metrics, our experiments reveal that VGONs do exhibit posterior collapse, both as a decline in the Kullback-Leibler divergence (KLD) and as the collapse of individual latent variables. Furthermore, the results show that VGONs and VAEs share a similar polarized regime, suggesting that the cause of posterior collapse is not specific to the architecture of the model used to produce the encoding. These findings support the claim made in previous research that posterior collapse is a general issue affecting a wide range of latent variable models.
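The metrics named in the abstract can be sketched concretely. Assuming a diagonal Gaussian posterior q(z|x) = N(mu, sigma^2) and a standard normal prior, the per-dimension KL divergence has a closed form, and a latent dimension is commonly flagged as collapsed when its batch-averaged KL falls below a small threshold. The function names and the threshold `eps` below are illustrative assumptions, not from the paper:

```python
import numpy as np

def gaussian_kl(mu, logvar):
    """Per-dimension KL( N(mu, sigma^2) || N(0, 1) ) for a diagonal Gaussian.

    Uses the closed form 0.5 * (mu^2 + sigma^2 - 1 - log sigma^2).
    """
    return 0.5 * (mu**2 + np.exp(logvar) - 1.0 - logvar)

def collapsed_dims(mu, logvar, eps=1e-2):
    """Flag latent dimensions whose batch-averaged KL is below eps.

    mu, logvar: arrays of shape (batch, latent_dim).
    Returns a boolean array of shape (latent_dim,).
    """
    kl_per_dim = gaussian_kl(mu, logvar).mean(axis=0)
    return kl_per_dim < eps
```

A posterior that matches the prior exactly (mu = 0, logvar = 0) yields zero KL in every dimension, so every dimension is flagged as collapsed; any dimension whose mean or variance departs from the prior accrues positive KL and escapes the flag.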