INFORMATION MAXIMIZATION AUTO-ENCODING

Dejiao Zhang, Tianchen Zhao, Laura Balzano

Sep 27, 2018, ICLR 2019 Conference Blind Submission
  • Abstract: We propose the Information Maximization Autoencoder (IMAE), an information theoretic approach to simultaneously learn continuous and discrete representations in an unsupervised setting. Unlike the Variational Autoencoder framework, IMAE starts from a stochastic encoder that seeks to map each input data to a hybrid discrete and continuous representation with the objective of maximizing the mutual information between the data and their representations. A decoder is included to approximate the posterior distribution of the data given their representations, where a high fidelity approximation can be achieved by leveraging the informative representations. We show that the proposed objective is theoretically valid and provides a principled framework for understanding the tradeoffs regarding informativeness of each representation factor, disentanglement of representations, and decoding quality.
  • Keywords: Information maximization, unsupervised learning of hybrid of discrete and continuous representations
  • TL;DR: An information-theoretic approach for unsupervised learning of a hybrid of discrete and continuous representations.
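The abstract describes an encoder that maps each input to a hybrid discrete and continuous representation. As a minimal illustrative sketch (not the authors' implementation), the two factors could be sampled differentiably with a reparameterized Gaussian for the continuous part and a Gumbel-softmax relaxation for the discrete part; both are standard choices assumed here for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_continuous(mu, log_sigma):
    # Reparameterized Gaussian sample: z = mu + sigma * eps
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(log_sigma) * eps

def sample_discrete(logits, tau=0.5):
    # Gumbel-softmax relaxation of a categorical factor (temperature tau)
    g = -np.log(-np.log(rng.uniform(size=logits.shape)))
    y = (logits + g) / tau
    y = np.exp(y - y.max(axis=-1, keepdims=True))
    return y / y.sum(axis=-1, keepdims=True)

# Toy encoder outputs for a batch of 4 inputs (hypothetical shapes)
mu, log_sigma = rng.standard_normal((4, 2)), np.zeros((4, 2))
logits = rng.standard_normal((4, 3))

z_cont = sample_continuous(mu, log_sigma)
z_disc = sample_discrete(logits)

# Hybrid representation: concatenate continuous and discrete factors
z = np.concatenate([z_cont, z_disc], axis=-1)
print(z.shape)                                 # (4, 5)
print(bool(np.allclose(z_disc.sum(axis=-1), 1.0)))  # True: each discrete factor sums to 1
```

A decoder would then consume `z` to approximate the posterior of the data given the representation, with a mutual-information objective tying the two together as described above.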