A Bayesian Bootstrap Framework for Mutual Information Neural Estimation: Bridging Classical Mutual Information Learning and Bayesian Nonparametric Learning

Published: 02 Mar 2026, Last Modified: 02 Mar 2026
Accepted by TMLR
License: CC BY 4.0
Abstract: In this work, we introduce a Bayesian bootstrap resampling framework for estimating mutual information (MI) via "mutual information neural estimation" (MINE), making MINE directly applicable within a Bayesian nonparametric learning (BNPL) framework. The resulting estimator shows low variability across batch sizes and high-dimensional settings, as demonstrated through extensive numerical studies. In particular, our proposed bootstrap version yields tighter and lower-variance estimates than the original MINE formulation, both theoretically and empirically. We further demonstrate its practical value in a downstream task by improving VAE-GAN training within BNPL, leading to higher-quality outputs. Beyond enabling MI-based BNPL, the proposed bootstrap estimator also performs competitively against leading frequentist state-of-the-art benchmarks. Overall, our findings establish the first principled framework for Bayesian bootstrap-based MI estimation and highlight its effectiveness as a reliable tool for future BNPL studies.
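To make the abstract's idea concrete, here is a minimal sketch of how Bayesian bootstrap resampling can be combined with the Donsker-Varadhan (DV) bound that underlies MINE. This is an illustrative toy, not the paper's implementation: instead of training a neural critic, it uses the analytic density-ratio critic for a bivariate Gaussian (for which the true MI is known in closed form), and it draws flat Dirichlet(1, ..., 1) weights over the samples, the standard Bayesian bootstrap, to obtain a posterior-style distribution of MI estimates. All variable names and the choice of correlation level are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
rho, n = 0.6, 5000
# Closed-form MI of a bivariate Gaussian with correlation rho.
true_mi = -0.5 * np.log(1.0 - rho**2)

# Sample jointly Gaussian pairs (X, Y) with correlation rho.
x = rng.standard_normal(n)
y = rho * x + np.sqrt(1.0 - rho**2) * rng.standard_normal(n)

def critic(x, y, rho):
    # Analytic log density ratio log p(x,y)/(p(x)p(y)); in MINE proper,
    # this role is played by a trained neural network T_theta.
    return (-0.5 * np.log(1.0 - rho**2)
            + rho * (2.0 * x * y - rho * x**2 - rho * y**2) / (2.0 * (1.0 - rho**2)))

def weighted_dv(x, y, w_joint, w_marg, rho):
    # Donsker-Varadhan lower bound with sample weights:
    # sum_i w_i T(x_i, y_i) - log sum_j v_j exp(T(x_j, y'_j)),
    # where y' is a permutation of y that breaks the dependence.
    y_shuffled = rng.permutation(y)
    joint_term = np.sum(w_joint * critic(x, y, rho))
    marg_term = np.log(np.sum(w_marg * np.exp(critic(x, y_shuffled, rho))))
    return joint_term - marg_term

# Bayesian bootstrap: B replicates with Dirichlet(1,...,1) weights
# replacing the empirical 1/n weights in both expectations.
B = 200
ests = np.array([
    weighted_dv(x, y, rng.dirichlet(np.ones(n)), rng.dirichlet(np.ones(n)), rho)
    for _ in range(B)
])
print(f"true MI = {true_mi:.3f}, "
      f"bootstrap mean = {ests.mean():.3f}, sd = {ests.std():.3f}")
```

Because the critic here is the exact density ratio, the replicate mean should sit close to the true MI, and the spread of `ests` plays the role of the estimator's uncertainty; in the paper's setting the critic is instead learned by a neural network, and the Dirichlet reweighting is what connects MINE to the BNPL framework.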
Submission Length: Long submission (more than 12 pages of main content)
Assigned Action Editor: ~Francisco_J._R._Ruiz1
Submission Number: 6101