Variational Learning with Disentanglement-PyTorch

Anonymous

15 Nov 2019 (modified: 05 May 2023) · NeurIPS 2019 Workshop DC S2 Blind Submission
Abstract: Unsupervised learning of disentangled representations is an open problem in machine learning. The Disentanglement-PyTorch library was developed to facilitate research, implementation, and testing of new variational algorithms. In this modular library, neural architectures, the dimensionality of the latent space, and the training algorithms are fully decoupled, allowing for independent and consistent experiments across variational methods. The library handles training scheduling, logging, and visualization of reconstructions and latent-space traversals. It also evaluates the encodings with various disentanglement metrics. So far, the library includes implementations of the following unsupervised algorithms: VAE, Beta-VAE, Factor-VAE, DIP-I-VAE, DIP-II-VAE, Info-VAE, and Beta-TCVAE, as well as conditional approaches such as CVAE and IFCVAE. The library is compatible with the Disentanglement Challenge of NeurIPS 2019, hosted on AICrowd, and was used to compete in the first and second stages of the challenge, where it ranked among the top participants.
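
To give a flavor of the variational objectives the library implements, the sketch below shows a beta-weighted VAE loss (the Beta-VAE objective named in the abstract) written in plain PyTorch. This is an illustrative sketch only; the module and function names (SimpleEncoder, SimpleDecoder, beta_vae_loss) are hypothetical and are not the library's API.

```python
# Minimal, self-contained sketch of the Beta-VAE objective:
# reconstruction term plus a beta-weighted KL(q(z|x) || N(0, I)).
# All names here are hypothetical, not taken from Disentanglement-PyTorch.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SimpleEncoder(nn.Module):
    def __init__(self, in_dim=64 * 64, z_dim=10):
        super().__init__()
        self.net = nn.Sequential(nn.Flatten(), nn.Linear(in_dim, 256), nn.ReLU())
        self.mu = nn.Linear(256, z_dim)       # mean of q(z|x)
        self.logvar = nn.Linear(256, z_dim)   # log-variance of q(z|x)

    def forward(self, x):
        h = self.net(x)
        return self.mu(h), self.logvar(h)

class SimpleDecoder(nn.Module):
    def __init__(self, out_dim=64 * 64, z_dim=10):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(z_dim, 256), nn.ReLU(),
                                 nn.Linear(256, out_dim))

    def forward(self, z):
        return self.net(z)   # logits of the reconstruction

def beta_vae_loss(x, encoder, decoder, beta=4.0):
    """Negative ELBO with the KL term scaled by beta (beta=1 recovers the plain VAE)."""
    mu, logvar = encoder(x)
    std = torch.exp(0.5 * logvar)
    z = mu + std * torch.randn_like(std)          # reparameterization trick
    recon_logits = decoder(z)
    recon_loss = F.binary_cross_entropy_with_logits(
        recon_logits, x.flatten(start_dim=1), reduction="sum") / x.size(0)
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp()) / x.size(0)
    return recon_loss + beta * kl

# Example usage on a batch of 64x64 grayscale images with pixel values in [0, 1]:
# x = torch.rand(8, 1, 64, 64)
# loss = beta_vae_loss(x, SimpleEncoder(), SimpleDecoder())
```

In a modular setup such as the one the abstract describes, the encoder/decoder architectures, the latent dimensionality (z_dim), and the training objective (here, the beta-weighted KL) are specified independently, so the same networks can be reused across different variational algorithms.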
Keywords: Disentanglement, PyTorch, Representation Learning, Total Correlation, Factorization
TL;DR: Disentanglement-PyTorch is a library for variational representation learning