TL;DR: A mutual information estimator based on nonextensive statistical mechanics
Abstract: This paper addresses the limitations of mutual information estimators based on variational optimization. By redefining the cost using generalized functions from nonextensive statistical mechanics, we raise the upper bound of previous estimators and enable control of the bias-variance trade-off. Variational estimators outperform previous methods, especially in the high-dependence, high-dimensional scenarios found in machine learning setups. Despite their performance, these estimators either exhibit high variance or are upper bounded by log(batch size). Our approach, inspired by nonextensive statistical mechanics, uses different generalizations for the logarithm and for the exponential in the partition function. This enables the estimator to capture changes in mutual information over a wider range of dimensions and correlations of the input variables, whereas previous estimators saturate.
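A minimal sketch of the idea described in the abstract: replace the standard logarithm and exponential in an InfoNCE-style variational bound with Tsallis q-deformed counterparts. The function names, the entropic indices `q_log_idx` and `q_exp_idx`, the use of InfoNCE as the base bound, and the exact placement of the deformations are assumptions for illustration, not the authors' formulation from the paper or repository.

```python
# Hypothetical sketch: a q-deformed InfoNCE-style lower bound on mutual information.
# The placement of the q-logarithm and q-exponential in the partition function is an
# assumption inspired by the abstract; both recover the standard bound as q -> 1.
import torch

def q_log(x, q):
    """Tsallis q-logarithm: ln_q(x) = (x^(1-q) - 1) / (1 - q); reduces to log(x) as q -> 1."""
    if abs(q - 1.0) < 1e-8:
        return torch.log(x)
    return (x.pow(1.0 - q) - 1.0) / (1.0 - q)

def q_exp(x, q):
    """Tsallis q-exponential: exp_q(x) = [1 + (1 - q) x]_+^(1 / (1 - q)); reduces to exp(x) as q -> 1."""
    if abs(q - 1.0) < 1e-8:
        return torch.exp(x)
    return torch.clamp(1.0 + (1.0 - q) * x, min=0.0).pow(1.0 / (1.0 - q))

def q_infonce(scores, q_log_idx=1.0, q_exp_idx=1.0):
    """
    scores: [B, B] critic matrix f(x_i, y_j); diagonal entries score the positive pairs.
    Returns an InfoNCE-style MI estimate where the log and the exp inside the partition
    function are replaced by their q-deformed counterparts.
    """
    pos = scores.diag()                               # f(x_i, y_i)
    partition = q_exp(scores, q_exp_idx).mean(dim=1)  # (1/B) * sum_j exp_q(f(x_i, y_j))
    # Standard InfoNCE: E[ f(x_i, y_i) - log( (1/B) * sum_j exp(f(x_i, y_j)) ) ]
    return (pos - q_log(partition, q_log_idx)).mean()

# Usage sketch: q values near 1 recover the standard InfoNCE estimate (capped at log B);
# moving the indices away from 1 trades bias for variance.
if __name__ == "__main__":
    torch.manual_seed(0)
    scores = torch.randn(128, 128)
    print(q_infonce(scores, q_log_idx=0.9, q_exp_idx=1.1).item())
```

With both indices set to 1 the expression above reduces to the familiar log(batch size)-capped contrastive bound, which is what makes the deformation parameters a natural knob for the bias-variance trade-off mentioned in the abstract.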
Code: https://github.com/valeriu-balaban/variational-bounds-mi-nonextensive-statistics
Keywords: mutual information, variational bounds, nonextensive statistical mechanics