Variational lower bounds on mutual information based on nonextensive statistical mechanics

25 Sept 2019 (modified: 05 May 2023) · ICLR 2020 Conference Withdrawn Submission
TL;DR: A mutual information estimator based on nonextensive statistical mechanics
Abstract: This paper aims to address the limitations of mutual information estimators based on variational optimization. By redefining the cost using generalized functions from nonextensive statistical mechanics, we raise the upper bound of previous estimators and enable control of the bias-variance trade-off. Variational estimators outperform previous methods, especially in the high-dependence, high-dimensional scenarios found in machine learning setups. Despite their performance, these estimators either exhibit high variance or are upper bounded by log(batch size). Our approach, inspired by nonextensive statistical mechanics, uses different generalizations for the logarithm and for the exponential in the partition function. This enables the estimator to capture changes in mutual information over a wider range of dimensions and correlations of the input variables, whereas previous estimators saturate.
Code: https://github.com/valeriu-balaban/variational-bounds-mi-nonextensive-statistics
Keywords: mutual information, variational bounds, nonextensive statistical mechanics
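The generalized logarithm and exponential mentioned in the abstract are, in nonextensive (Tsallis) statistics, the q-logarithm and q-exponential. The sketch below illustrates, under that assumption, how such deformations could replace log and exp in an InfoNCE-style variational bound; the names q_log, q_exp, deformed_infonce, q_log_param, and q_exp_param are hypothetical and are not taken from the linked repository or the paper.

import torch

def q_log(x, q):
    # Tsallis q-logarithm: ln_q(x) = (x^(1 - q) - 1) / (1 - q); recovers log(x) as q -> 1
    if q == 1.0:
        return torch.log(x)
    return (x.pow(1.0 - q) - 1.0) / (1.0 - q)

def q_exp(x, q):
    # Tsallis q-exponential: exp_q(x) = [1 + (1 - q) x]_+^(1 / (1 - q)); recovers exp(x) as q -> 1
    if q == 1.0:
        return torch.exp(x)
    return torch.clamp(1.0 + (1.0 - q) * x, min=0.0).pow(1.0 / (1.0 - q))

def deformed_infonce(scores, q_log_param=1.0, q_exp_param=1.0):
    # scores: [N, N] critic outputs f(x_i, y_j); the diagonal holds the positive pairs.
    # With both deformation parameters set to 1 this reduces to the standard InfoNCE bound.
    positives = q_exp(scores.diag(), q_exp_param)        # generalized exponential of positive scores
    partition = q_exp(scores, q_exp_param).mean(dim=1)   # generalized partition over the batch
    return q_log(positives / partition, q_log_param).mean()

For example, given a critic matrix scores = critic(x, y) over a batch, calling deformed_infonce(scores, q_log_param=0.9, q_exp_param=1.1) would evaluate one such deformed bound; how the two parameters should actually be set to control the bias-variance trade-off is described in the paper, and the values here are illustrative only.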
