OMIE: The Online Mutual Information Estimator

Anonymous

Nov 07, 2017 (modified: Nov 07, 2017) · ICLR 2018 Conference Blind Submission
  • Abstract: This paper presents the Online Mutual Information Estimator (OMIE), which scales linearly in both the dimension of the data and the sample size. OMIE is trainable through back-propagation, and we prove that it is strongly consistent. We apply the estimation principle underlying the mutual information estimator to general f-divergence dependency measures as well as to integral-probability-metric dependency measures. We illustrate a handful of applications in which OMIE is successfully applied to improve the properties of generative models in both unsupervised and supervised settings. We also apply our framework to estimate the information bottleneck and use it in supervised classification tasks, showing that it brings substantial added flexibility and improvement in these settings.
  • TL;DR: A mutual information estimator that scales in both sample size and dimension.
  • Keywords: Deep Learning, Neural Networks, Information Theory, Generative models
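The abstract does not spell out the estimator, but a common way to build a back-propagatable, sample-based mutual information estimate is the Donsker-Varadhan lower bound, I(X;Y) ≥ E_P[T(x,y)] − log E_{P_X⊗P_Y}[e^{T(x,y)}], where the critic T would be a neural network in practice. The sketch below is an assumption-laden illustration, not the paper's method: it uses a correlated Gaussian pair where the optimal critic (the log density ratio, up to an additive constant that the bound is invariant to) is known in closed form, so the bound can be checked against the analytic mutual information without any training loop.

```python
import numpy as np

rng = np.random.default_rng(0)
rho = 0.8          # correlation of the Gaussian pair
n = 100_000        # number of samples

# Joint samples (x, y) from a standard bivariate Gaussian with correlation rho.
x = rng.standard_normal(n)
y = rho * x + np.sqrt(1 - rho**2) * rng.standard_normal(n)

# "Marginal" samples: permuting y breaks the dependence, giving draws
# from the product of marginals P_X (x) P_Y.
y_shuf = rng.permutation(y)

def critic(x, y):
    # Closed-form log density ratio log p(x,y) / (p(x) p(y)) for this
    # Gaussian pair, up to an additive constant (the DV bound is invariant
    # to constant shifts of T). In the paper's setting this role would be
    # played by a trained neural network.
    return (rho * x * y - 0.5 * rho**2 * (x**2 + y**2)) / (1 - rho**2)

# Donsker-Varadhan lower bound: E_joint[T] - log E_marginals[exp(T)].
dv = critic(x, y).mean() - np.log(np.mean(np.exp(critic(x, y_shuf))))

true_mi = -0.5 * np.log(1 - rho**2)  # analytic MI in nats
print(f"DV estimate: {dv:.3f}   true MI: {true_mi:.3f}")
```

With the optimal critic, the two numbers agree up to Monte Carlo error; with a learned critic, maximizing the same expression over the network's parameters tightens the bound toward the true mutual information, and every term is differentiable, which is what makes the estimator compatible with back-propagation.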