Independent versus truncated finite approximations for Bayesian nonparametric inference

Published: 09 Dec 2020, Last Modified: 05 May 2023
ICBINB 2020 Oral
Keywords: Bayesian nonparametrics, finite approximations, error bounds, topic modeling, dictionary learning
TL;DR: The worst-case approximation error of an independent finite approximation is larger than that of a truncated finite approximation, but on two real-data analyses, these two approximations have similar performance.
Abstract: Bayesian nonparametric models based on completely random measures (CRMs) offer flexibility when the number of clusters or latent components in a data set is unknown. However, managing the infinite dimensionality of CRMs often leads to slow computation during inference. Practical inference typically relies on either integrating out the infinite-dimensional parameter or using a finite approximation: a truncated finite approximation (TFA) or an independent finite approximation (IFA). The atom weights of TFAs are constructed sequentially, while the atoms of IFAs are independent, which facilitates more convenient inference schemes. While the approximation error of TFAs has been systematically studied, there has not yet been a similar study of IFAs. We quantify the approximation error between IFAs and two common target nonparametric priors (the beta-Bernoulli process and the Dirichlet process mixture model) and prove that, in the worst case, TFAs provide more component-efficient approximations than IFAs. However, in experiments on image denoising and topic modeling tasks with real data, we find that the error of Bayesian approximation methods overwhelms any finite approximation error, and IFAs perform very similarly to TFAs.
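To make the TFA/IFA distinction concrete, here is a minimal sketch (not code from the paper; the Dirichlet process target, the function names, and the choice of K atoms are illustrative assumptions) of the two constructions: a TFA built by sequential stick-breaking, where each atom weight depends on all earlier sticks, versus an IFA built from i.i.d. Gamma atom weights.

```python
# Minimal sketch contrasting a TFA and an IFA for a Dirichlet process
# with concentration alpha, approximated with K atoms. Illustrative only.
import numpy as np

rng = np.random.default_rng(0)

def tfa_weights(alpha: float, K: int) -> np.ndarray:
    """Truncated finite approximation: sequential stick-breaking.

    w_k = v_k * prod_{j<k} (1 - v_j), so each weight is coupled to all
    earlier sticks. The leftover stick mass after K breaks is the
    truncation error (it can be folded into the final atom if desired).
    """
    v = rng.beta(1.0, alpha, size=K)                       # stick proportions
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - v)[:-1]))
    return v * remaining

def ifa_weights(alpha: float, K: int) -> np.ndarray:
    """Independent finite approximation: i.i.d. Gamma(alpha/K, 1) weights,
    normalized (equivalently, a symmetric Dirichlet(alpha/K, ..., alpha/K)).
    The unnormalized weights are mutually independent across atoms.
    """
    g = rng.gamma(alpha / K, 1.0, size=K)
    return g / g.sum()

alpha, K = 1.0, 50
print("TFA top weights:", np.sort(tfa_weights(alpha, K))[::-1][:5])
print("IFA top weights:", np.sort(ifa_weights(alpha, K))[::-1][:5])
```

Because the IFA's unnormalized weights are independent across atoms, inference updates (e.g., mean-field variational or Gibbs steps) factor atom by atom, which is the computational convenience the abstract alludes to; the TFA's sequential construction is what admits the existing truncation-error guarantees.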