Locking and Quacking: Stacking Bayesian model predictions by log-pooling and superposition

Published: 29 Nov 2022, Last Modified: 05 May 2023
Venue: SBM 2022 Poster
Readers: Everyone
Keywords: score matching, Hyvärinen score, model combination
TL;DR: New ways to combine predictives that bypass computing the normalising constant
Abstract: Combining predictive distributions is a central problem in Bayesian inference and machine learning. Currently, predictives are almost exclusively combined using linear density mixtures such as Bayesian model averaging, Bayesian stacking, and mixture of experts. Nonetheless, linear mixtures impose traits that might be undesirable for some applications, such as multi-modality. While there are alternative strategies (e.g., geometric bridge or superposition), optimizing their parameters usually requires repeatedly computing intractable normalizing constants. In this extended abstract, we present two novel Bayesian model combination tools. They are generalizations of \emph{stacking}, but combine posterior densities by log-linear pooling (\emph{locking}) and quantum superposition (\emph{quacking}). To optimize model weights while avoiding the burden of normalizing constants, we maximize the Hyv\"arinen score of the combined posterior predictive. We demonstrate locking and quacking with an illustrative example.
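To illustrate why the Hyvärinen score sidesteps the normalizing constant, the sketch below (not from the paper; the two Gaussian component predictives, the toy data, the softmax weight parameterization, and the plain gradient-descent loop are all illustrative assumptions) forms a log-linear pool ("locking") of two unnormalized log densities and tunes the weights by minimizing an empirical Hyvärinen score. Because the score depends only on derivatives of the log density, any additive constant, including the intractable log normalizer of the pool, drops out.

```python
import jax
import jax.numpy as jnp

# Two hypothetical component predictives: unit-variance Gaussians, unnormalized log densities.
def log_p1(y):
    return -0.5 * (y - (-1.0)) ** 2

def log_p2(y):
    return -0.5 * (y - 2.0) ** 2

def pooled_logpdf(y, w):
    # Log-linear pool ("locking"), up to an additive constant: sum_k w_k * log p_k(y).
    return w[0] * log_p1(y) + w[1] * log_p2(y)

def hyvarinen_score(y, w):
    # One common scaling of the Hyvärinen score in 1-D:
    # H(y) = 2 * d^2/dy^2 log p(y) + (d/dy log p(y))^2.
    # Only derivatives of log p appear, so the normalizing constant cancels.
    dlogp = jax.grad(pooled_logpdf, argnums=0)
    d2logp = jax.grad(dlogp, argnums=0)
    return 2.0 * d2logp(y, w) + dlogp(y, w) ** 2

def objective(logits, ys):
    # Keep the weights on the simplex via softmax; average the score over held-out draws.
    w = jax.nn.softmax(logits)
    return jnp.mean(jax.vmap(lambda y: hyvarinen_score(y, w))(ys))

ys = jnp.array([0.3, 0.8, 1.1, 1.4, 0.9])  # toy "future data" draws (made up)
logits = jnp.zeros(2)
for _ in range(200):                        # plain gradient descent on the weight logits
    logits = logits - 0.05 * jax.grad(objective)(logits, ys)

print(jax.nn.softmax(logits))               # optimized locking weights
```

Quacking would follow the same recipe with the pooled log density replaced by the log of a squared weighted sum of root densities; the score-matching objective again needs only its derivatives.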
Student Paper: No