INFORMATION MAXIMIZATION AUTO-ENCODING

27 Sept 2018 (modified: 05 May 2023) · ICLR 2019 Conference Blind Submission · Readers: Everyone
Abstract: We propose the Information Maximization Autoencoder (IMAE), an information-theoretic approach to simultaneously learning continuous and discrete representations in an unsupervised setting. Unlike the Variational Autoencoder framework, IMAE starts from a stochastic encoder that seeks to map each data point to a hybrid discrete and continuous representation, with the objective of maximizing the mutual information between the data and their representations. A decoder is included to approximate the posterior distribution of the data given their representations, where a high-fidelity approximation can be achieved by leveraging the informative representations. We show that the proposed objective is theoretically valid and provides a principled framework for understanding the tradeoffs among the informativeness of each representation factor, the disentanglement of the representations, and the decoding quality.
Keywords: Information maximization, unsupervised learning of a hybrid of discrete and continuous representations
TL;DR: An information-theoretic approach for unsupervised learning of a hybrid of discrete and continuous representations.
Data: [Fashion-MNIST](https://paperswithcode.com/dataset/fashion-mnist), [dSprites](https://paperswithcode.com/dataset/dsprites)
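
The page gives only the high-level objective, so below is a minimal, hypothetical PyTorch sketch of the setup the abstract describes: a stochastic encoder producing a Gaussian continuous code and a Gumbel-Softmax relaxed discrete code, a decoder over their concatenation, and common variational surrogates (a Gaussian KL penalty on the continuous code and an entropy-based lower bound on I(x; y) for the discrete code) standing in for the paper's actual mutual-information objective. All names (`IMAESketch`, `imae_loss`) and the specific loss terms are illustrative assumptions, not the authors' formulation.

```python
# Hypothetical sketch of an IMAE-style model, NOT the authors' implementation.
# Assumptions: diagonal-Gaussian continuous code, Gumbel-Softmax discrete code,
# and standard variational surrogates for the mutual-information terms.
import torch
import torch.nn as nn
import torch.nn.functional as F

class IMAESketch(nn.Module):
    def __init__(self, x_dim=784, z_dim=8, k=10, hidden=256):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(x_dim, hidden), nn.ReLU())
        self.mu = nn.Linear(hidden, z_dim)       # continuous code mean
        self.logvar = nn.Linear(hidden, z_dim)   # continuous code log-variance
        self.logits = nn.Linear(hidden, k)       # discrete code logits
        self.dec = nn.Sequential(
            nn.Linear(z_dim + k, hidden), nn.ReLU(), nn.Linear(hidden, x_dim))

    def forward(self, x, tau=0.5):
        h = self.enc(x)
        mu, logvar, logits = self.mu(h), self.logvar(h), self.logits(h)
        # Reparameterized sample of the continuous code z ~ N(mu, sigma^2)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()
        # Differentiable relaxed sample of the discrete code y
        y = F.gumbel_softmax(logits, tau=tau)
        x_hat = self.dec(torch.cat([z, y], dim=-1))   # decoder logits
        return x_hat, mu, logvar, logits

def imae_loss(model, x, beta=1.0):
    x_hat, mu, logvar, logits = model(x)
    # Reconstruction term: decoding quality given the hybrid representation
    recon = F.binary_cross_entropy_with_logits(
        x_hat, x, reduction="sum") / x.size(0)
    # KL(q(z|x) || N(0, I)): a surrogate controlling how much information
    # the continuous code carries (assumed, not the paper's exact term)
    kl_z = 0.5 * (mu.pow(2) + logvar.exp() - 1 - logvar).sum(1).mean()
    # Lower bound on I(x; y) = H(y) - H(y|x): reward an informative,
    # well-used discrete code (again an assumed surrogate)
    q = F.softmax(logits, dim=-1)
    h_y_given_x = -(q * F.log_softmax(logits, dim=-1)).sum(1).mean()
    marginal = q.mean(0)
    h_y = -(marginal * (marginal + 1e-8).log()).sum()
    return recon + beta * kl_z - (h_y - h_y_given_x)

if __name__ == "__main__":
    model = IMAESketch()
    x = torch.rand(64, 784)   # toy batch of inputs in [0, 1]
    print(imae_loss(model, x).item())
```

The `beta` weight makes the abstract's tradeoff explicit: raising it shrinks the information in the continuous code, while the entropy terms push the discrete code to stay informative about the data.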