Data-Efficient Variational Mutual Information Estimation via Bayesian Self-Consistency

Published: 10 Oct 2024, Last Modified: 06 Nov 2024
Venue: NeurIPS BDU Workshop 2024 Poster
License: CC BY 4.0
Keywords: variational inference, mutual information, amortized inference, Bayesian experimental design
Abstract: Mutual information (MI) is a central quantity in information theory and machine learning, but estimating it accurately and efficiently remains challenging. In this paper, we propose a novel approach that exploits Bayesian self-consistency to improve the data efficiency of variational MI estimators. Our method incorporates a principled variance penalty that encourages consistency in marginal likelihood estimates, leading to more accurate MI estimation and posterior approximation with fewer gradient steps. We demonstrate the effectiveness of our method on two tasks: (1) MI estimation for correlated Gaussian distributions; and (2) Bayesian experimental design for the Michaelis-Menten model. Our results show that the self-consistent estimator converges faster while producing higher-quality MI and posterior estimates than the baselines.
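To illustrate the self-consistency idea behind the variance penalty, here is a minimal sketch (not the authors' implementation). It uses the Bayesian identity log p(y) = log p(y|θ) + log p(θ) − log p(θ|y), which holds for every θ; substituting a variational posterior q(θ|y) for the true posterior makes the right-hand side vary with θ, and the variance of these estimates across θ values serves as a consistency penalty. The conjugate Gaussian model, the function names, and the choice of evaluation points are all assumptions made for this toy example.

```python
import numpy as np

def log_normal(x, mu, var):
    """Log density of a univariate Gaussian N(mu, var) at x."""
    return -0.5 * np.log(2.0 * np.pi * var) - (x - mu) ** 2 / (2.0 * var)

def self_consistency_penalty(y, q_mu, q_var, thetas):
    """Variance of log-marginal-likelihood estimates across theta values.

    Toy model (assumed for illustration): theta ~ N(0, 1), y | theta ~ N(theta, 1).
    By Bayes' rule, log p(y|theta) + log p(theta) - log p(theta|y) is the same
    for every theta; with an approximate posterior q(theta|y) in place of the
    true posterior, the estimates disagree, and their variance penalizes the
    inconsistency.
    """
    log_p_y_est = (log_normal(y, thetas, 1.0)      # likelihood
                   + log_normal(thetas, 0.0, 1.0)  # prior
                   - log_normal(thetas, q_mu, q_var))  # approximate posterior
    return log_p_y_est.var()

y = 1.3
thetas = np.linspace(-2.0, 2.0, 50)

# Exact posterior for this conjugate model is N(y/2, 1/2): penalty is ~0.
penalty_exact = self_consistency_penalty(y, y / 2.0, 0.5, thetas)

# A mismatched q (here, the prior) yields a strictly positive penalty.
penalty_wrong = self_consistency_penalty(y, 0.0, 1.0, thetas)
```

In an actual training loop this penalty would be added to the variational objective and minimized jointly, so that reducing it pushes q(θ|y) toward the true posterior.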
Submission Number: 90