Batch, match, and patch: low-rank approximations for score-based variational inference
TL;DR: We develop a score-based variational inference algorithm that fits a Gaussian with a structured covariance matrix.
Abstract: Black-box variational inference (BBVI) scales poorly to high-dimensional problems when it is used to estimate a multivariate Gaussian
approximation with a full covariance matrix. In this paper, we extend the _batch-and-match_ (BaM) framework for score-based BBVI to
problems where it is prohibitively expensive to store such covariance matrices, let alone to estimate them. Unlike classical algorithms for
BBVI, which use stochastic gradient descent to minimize the reverse Kullback-Leibler divergence, BaM uses more specialized updates
to match the scores of the target density and its Gaussian approximation. We extend the updates for BaM by integrating them with a more compact parameterization of full covariance matrices. In particular, borrowing ideas from factor analysis, we add an extra step to
each iteration of BaM---a _patch_---that projects each newly updated covariance matrix into a more efficiently parameterized family of diagonal-plus-low-rank matrices. We evaluate this approach on a variety of synthetic target distributions and real-world problems in
high-dimensional inference.
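To make the "patch" idea concrete, the following is a minimal illustrative sketch of one way to project a full covariance matrix onto the diagonal-plus-low-rank family, using a truncated eigendecomposition for the low-rank factor and matching the residual diagonal. This is a hypothetical construction for intuition only; the paper's actual patch step may use a different projection (e.g., one derived from a divergence minimization), and the function name `patch` is our own.

```python
import numpy as np

def patch(Sigma, rank):
    """Project a full covariance Sigma onto the family D + U @ U.T,
    with D diagonal (stored as a vector) and U of shape (d, rank).

    Illustrative sketch only: low-rank part from the top eigenpairs,
    diagonal part from the residual's diagonal.
    """
    eigvals, eigvecs = np.linalg.eigh(Sigma)          # ascending eigenvalues
    top = np.argsort(eigvals)[::-1][:rank]            # indices of largest eigenvalues
    U = eigvecs[:, top] * np.sqrt(np.clip(eigvals[top], 0.0, None))
    residual = Sigma - U @ U.T
    D = np.clip(np.diag(residual), 1e-8, None)        # keep the diagonal positive
    return D, U
```

Storing `D` (length d) and `U` (d-by-rank) costs O(d * rank) memory instead of the O(d^2) needed for a full covariance matrix, which is the motivation for the patch step in high dimensions.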
Submission Number: 1716