Keywords: Variational inference, stochastic algorithms, asymptotic analysis, alpha divergence, exponential models
TL;DR: We provide asymptotic results for algorithms optimizing the alpha-divergence criterion in the context of Variational Inference, using an exponential variational family.
Abstract: Recent works in Variational Inference have examined alternative criteria to the commonly used exclusive Kullback-Leibler divergence. Encouraging empirical results have been obtained with the family of alpha-divergences, but few works have studied the asymptotic properties of the proposed algorithms, in particular as the number of iterations goes to infinity. In this paper, we study a procedure that ensures a monotonic decrease in the alpha-divergence. We provide sufficient conditions guaranteeing its convergence to a local minimizer of the alpha-divergence at a geometric rate when the variational family belongs to the class of exponential models. The sample-based version of this ideal procedure involves biased gradient estimators, which hinders theoretical analysis. We propose an alternative unbiased algorithm, for which we prove almost sure convergence to a local minimizer of the alpha-divergence and a law of the iterated logarithm. Our results are illustrated on toy and real-data experiments.
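To make the setting concrete, the sketch below illustrates the general framework the abstract refers to: stochastic gradient descent on a Monte Carlo surrogate of the alpha-divergence with a Gaussian (exponential-family) variational approximation. The target density, step size, value of alpha, and gradient estimator are illustrative assumptions and do not reproduce the paper's procedure or its unbiased algorithm.

```python
import numpy as np

# Minimal sketch (assumptions throughout; not the paper's algorithm):
# stochastic gradient descent on a Monte Carlo surrogate of the
# alpha-divergence D_alpha(p || q_theta), with a 1-D Gaussian
# (exponential-family) variational approximation q_theta = N(mu, sigma^2).

rng = np.random.default_rng(0)
alpha = 0.5                      # assumed alpha in (0, 1)
mu, log_sigma = 0.0, 0.0         # variational parameters
lr, n_samples, n_iters = 5e-2, 256, 2000

def log_p(x):
    # Unnormalized target: a two-component Gaussian mixture (illustrative choice).
    return np.logaddexp(-0.5 * (x - 2.0) ** 2, -0.5 * (x + 2.0) ** 2)

def log_q(x, m, ls):
    s = np.exp(ls)
    return -0.5 * ((x - m) / s) ** 2 - ls - 0.5 * np.log(2.0 * np.pi)

for _ in range(n_iters):
    # Reparameterized samples x = mu + sigma * eps, eps ~ N(0, 1).
    eps = rng.standard_normal(n_samples)

    # For alpha in (0, 1), minimizing D_alpha(p || q) is equivalent to
    # maximizing E_q[(p/q)^alpha]; we minimize its negative. A fixed shift
    # keeps the exponentials numerically stable without changing the argmin.
    x0 = mu + np.exp(log_sigma) * eps
    shift = np.max(alpha * (log_p(x0) - log_q(x0, mu, log_sigma)))

    def neg_surrogate(m, ls):
        x = m + np.exp(ls) * eps
        lw = alpha * (log_p(x) - log_q(x, m, ls))
        return -np.mean(np.exp(lw - shift))

    # Central finite differences keep the sketch dependency-free;
    # an autodiff framework would normally be used instead.
    h = 1e-4
    g_mu = (neg_surrogate(mu + h, log_sigma) - neg_surrogate(mu - h, log_sigma)) / (2 * h)
    g_ls = (neg_surrogate(mu, log_sigma + h) - neg_surrogate(mu, log_sigma - h)) / (2 * h)

    mu -= lr * g_mu
    log_sigma -= lr * g_ls

print(f"fitted q: mean = {mu:.3f}, std = {np.exp(log_sigma):.3f}")
```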
Primary Area: Probabilistic methods (for example: variational inference, Gaussian processes)
Submission Number: 7468