Keywords: Data Efficient ML, Energy-based Models
TL;DR: We construct BPC via contrastive divergence, which requires no variational approximation and allows the use of MCMC methods for training. Only a finite number of MCMC steps is needed, alleviating the requirement of mixing to the stationary distribution.
Abstract: Bayesian methods provide an elegant framework for estimating parameter posteriors and quantifying the uncertainty associated with probabilistic models. However, they often suffer from slow inference times. To address this challenge, Bayesian Pseudo-Coresets (BPC) have emerged as a promising solution. BPC methods aim to create a small synthetic dataset, known as a pseudo-coreset, that approximates the posterior inference achieved with the original dataset. This approximation is achieved by optimizing a divergence measure between the true posterior and the pseudo-coreset posterior. Various divergence measures have been proposed for constructing pseudo-coresets, with forward Kullback-Leibler (KL) divergence being the most successful. However, using forward KL divergence necessitates sampling from the pseudo-coreset posterior, often accomplished through approximate Gaussian variational distributions. Alternatively, one could employ Markov Chain Monte Carlo (MCMC) methods for sampling, but this becomes challenging in high-dimensional parameter spaces due to slow mixing.
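For concreteness, the pseudo-coreset objective described above can be written as follows; the notation is a generic sketch and is not taken from the paper:
\[
u^{\star} \;=\; \operatorname*{arg\,min}_{u}\; \mathrm{KL}\!\big(\pi(\theta \mid \mathcal{D}) \,\big\|\, \pi(\theta \mid u)\big),
\]
where \(\mathcal{D}=\{x_i\}_{i=1}^{n}\) is the full dataset and \(u=\{u_j\}_{j=1}^{m}\), \(m \ll n\), is the pseudo-coreset. The gradient of this forward KL involves expectations under the pseudo-coreset posterior \(\pi(\theta \mid u)\), which is why samples from it are needed.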
In this study, we introduce a novel approach for constructing pseudo-coresets by utilizing contrastive divergence. Importantly, optimizing contrastive divergence eliminates the need for approximations in the pseudo-coreset construction process. Furthermore, it enables the use of finite-step MCMC methods, alleviating the requirement for extensive mixing to reach a stationary distribution. To validate our method's effectiveness, we conduct extensive experiments on multiple datasets, demonstrating its superiority over existing BPC techniques.
Our implementation is available at https://github.com/backpropagator/BPC-CD.
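The following is a minimal, hypothetical sketch of a contrastive-divergence-style pseudo-coreset update with finite-step SGLD. The toy logistic-regression model, all names, and the step sizes are illustrative assumptions, not the authors' implementation (see the repository above for that).

```python
# Hypothetical sketch: CD-style pseudo-coreset update (not the authors' code).
import torch
import torch.nn.functional as F

def log_lik(theta, x, y):
    # Toy Bayesian logistic-regression log-likelihood log p(y | x, theta).
    logits = x @ theta
    return -F.binary_cross_entropy_with_logits(logits, y, reduction="sum")

def sgld_step(theta, x, y, step=1e-3):
    # One SGLD transition targeting the posterior given (x, y) with a N(0, I) prior.
    theta = theta.detach().requires_grad_(True)
    log_post = log_lik(theta, x, y) - 0.5 * theta.pow(2).sum()
    grad = torch.autograd.grad(log_post, theta)[0]
    noise = torch.randn_like(theta) * (2.0 * step) ** 0.5
    return (theta + step * grad + noise).detach()

def cd_coreset_grad(u, yu, theta_pos, k=5):
    # "Positive" sample theta_pos ~ pi(theta | D) (true posterior).
    # "Negative" sample: only k SGLD steps of the chain targeting the
    # pseudo-coreset posterior pi(theta | u), initialised at theta_pos,
    # so the chain never has to mix to stationarity.
    theta_neg = theta_pos.clone()
    for _ in range(k):
        theta_neg = sgld_step(theta_neg, u.detach(), yu)
    u = u.detach().requires_grad_(True)
    obj = log_lik(theta_pos, u, yu) - log_lik(theta_neg, u, yu)
    obj.backward()
    return u.grad  # ascend this direction to shrink the forward KL

# Toy usage: 10 pseudo-points in 5 dimensions, a single posterior sample.
u = torch.randn(10, 5)
yu = torch.randint(0, 2, (10,)).float()
theta_pos = torch.randn(5)
u = u + 1e-2 * cd_coreset_grad(u, yu, theta_pos, k=5)
```

The point of the sketch is that the negative samples come from only k SGLD transitions initialised at the positive samples, so no chain has to mix to the stationary pseudo-coreset posterior.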
List Of Authors: Tiwary, Piyush and Shubham, Kumar and Kashyap, Vivek V and AP, Prathosh
Submission Number: 120