Keywords: Bayesian inference, natural gradient, variational inference, Riemannian gradient, free energy
TL;DR: Riemannian Black Box Variational Inference is a new gradient-free variational inference method offering competitive performance with fewer black-box function evaluations.
Abstract: We introduce Riemannian Black Box Variational Inference (RBBVI) for scenarios lacking gradient information of the model with respect to its parameters. Our method constrains posterior marginals to exponential families and optimizes the variational free energy using Riemannian geometry and gradients of the log-partition function. It excels with black-box or nondifferentiable models, where popular gradient-based methods fail. We demonstrate its efficacy by inferring parameters of the SIR model and by tuning neural network learning rates. The results show performance competitive with gradient-based (NUTS) and gradient-free (Latent Slice Sampling) methods, achieving better coverage and matching Bayesian optimization with fewer evaluations. RBBVI extends variational inference to settings where model gradients are unavailable, improving efficiency and flexibility for real-world applications.
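As a point of reference for the exponential-family machinery the abstract invokes, the following is a minimal sketch, under standard assumptions, of how a Riemannian (natural) gradient step on the variational free energy can be written in terms of the log-partition function; the notation (natural parameters $\eta$, sufficient statistics $T(z)$, log-partition $A(\eta)$, step size $\rho_t$) is ours and is not taken from the paper.

\[
q_\eta(z) = h(z)\exp\!\big(\eta^\top T(z) - A(\eta)\big),
\qquad
\mu := \mathbb{E}_{q_\eta}[T(z)] = \nabla_\eta A(\eta),
\qquad
F(\eta) := \nabla_\eta^2 A(\eta),
\]
\[
\mathcal{F}(\eta) = \mathbb{E}_{q_\eta}\!\big[\log q_\eta(z) - \log p(x, z)\big],
\qquad
\eta_{t+1} = \eta_t - \rho_t\, F(\eta_t)^{-1}\nabla_\eta \mathcal{F}(\eta_t) = \eta_t - \rho_t\, \nabla_\mu \mathcal{F}(\eta_t).
\]

Because the score of an exponential family is $\nabla_\eta \log q_\eta(z) = T(z) - \nabla_\eta A(\eta)$, the expectation over $\log p(x, z)$ can, for instance, be estimated with a score-function estimator that only requires evaluations of $\log p(x, z)$, consistent with the gradient-free, black-box setting described above; whether RBBVI uses exactly this estimator is not stated in the abstract.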
Submission Number: 71