OMIE: The Online Mutual Information Estimator
Nov 07, 2017 (modified: Nov 07, 2017) · ICLR 2018 Conference Blind Submission · Readers: everyone
Abstract: This paper presents OMIE, an Online Mutual Information Estimator that scales linearly in both the dimension of the data and the sample size. OMIE is trainable through back-propagation, and we prove that it is strongly consistent. We apply the estimation principle underlying OMIE to general dependency measures based on f-divergences as well as on integral probability metrics. We illustrate a handful of applications in which OMIE is successfully applied to improve the properties of generative models, in both unsupervised and supervised settings. We also apply our framework to estimate the information bottleneck in supervised classification tasks, and show that it brings substantial added flexibility and improvement in these settings.
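The abstract does not spell out the estimator's construction, but a neural mutual-information estimator of this kind is typically built on the Donsker-Varadhan lower bound, I(X;Y) >= sup_T E_p(x,y)[T(x,y)] - log E_p(x)p(y)[exp(T(x,y))], maximized over a parametric critic T by gradient ascent. The following is an illustrative numpy toy, not the paper's implementation: it uses a hand-rolled critic that is linear in the features [xy, x², y²] (sufficient for Gaussian data) on correlated Gaussians, where the true mutual information is known in closed form. All variable names are hypothetical.

```python
# Toy sketch of the Donsker-Varadhan (DV) mutual-information lower bound:
#   I(X;Y) >= max_w  E_joint[T_w] - log E_product[exp(T_w)]
# with a critic T_w linear in the features [x*y, x^2, y^2]. The DV objective
# is concave in w (linear term minus log-sum-exp), so gradient ascent suffices.
import numpy as np

rng = np.random.default_rng(0)
n, rho = 10_000, 0.8
x = rng.standard_normal(n)
y = rho * x + np.sqrt(1 - rho**2) * rng.standard_normal(n)
true_mi = -0.5 * np.log(1 - rho**2)  # analytic MI for bivariate Gaussians

def features(a, b):
    # Sufficient statistics for the optimal Gaussian DV critic (up to a constant).
    return np.stack([a * b, a**2, b**2], axis=1)

phi_joint = features(x, y)                  # samples from the joint p(x, y)
phi_marg = features(x, rng.permutation(y))  # shuffled y ~ product of marginals

w = np.zeros(3)
for _ in range(3000):                       # gradient ascent on the DV bound
    t_marg = phi_marg @ w
    soft = np.exp(t_marg - t_marg.max())
    soft /= soft.sum()                      # softmax weights: grad of the log-term
    grad = phi_joint.mean(axis=0) - soft @ phi_marg
    w += 0.02 * grad

t_marg = phi_marg @ w
log_mean_exp = np.log(np.mean(np.exp(t_marg - t_marg.max()))) + t_marg.max()
mi_estimate = (phi_joint @ w).mean() - log_mean_exp
print(f"DV estimate: {mi_estimate:.3f}  (true MI: {true_mi:.3f})")
```

In the paper's setting the linear critic would be replaced by a neural network, which is what makes the estimator back-propagable and applicable in high dimension; the shuffled-samples trick for drawing from the product of marginals carries over unchanged.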
TL;DR: A mutual information estimator that scales in both sample size and dimension.
Keywords:Deep Learning, Neural Networks, Information Theory, Generative models