Pseudo-Bayesian Learning via Direct Loss Minimization with Applications to Sparse Gaussian Process Models

16 Oct 2019 (modified: 05 May 2023) · AABI 2019
Keywords: sparse Gaussian processes, empirical risk minimization, agnostic PAC bound
TL;DR: This paper uses the analysis of Lipschitz losses on a bounded hypothesis space to derive new ERM-type algorithms with strong performance guarantees, applicable to non-conjugate sparse GP models.
Abstract: We propose that approximate Bayesian algorithms should optimize a new criterion, derived directly from the loss, to compute their approximate posterior, which we refer to as the pseudo-posterior. Unlike standard variational inference, which optimizes a lower bound on the log marginal likelihood, the new algorithms can be analyzed to provide loss guarantees on predictions made with the pseudo-posterior. Our criterion can be used to derive new sparse Gaussian process algorithms with error guarantees applicable to a range of likelihoods.
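
To make the contrast in the abstract concrete, below is a minimal Monte Carlo sketch (not the paper's exact algorithm; function names and the single-point Bernoulli setup are illustrative) comparing the standard variational data term, E_q[log p(y|f)], with a direct-loss data term, the log loss of the Bayesian predictive, -log E_q[p(y|f)], under one Gaussian pseudo-posterior q(f) = N(m, s^2):

```python
# Sketch only: contrasts the VI data term with a direct-loss data term for a
# single Gaussian pseudo-posterior q(f) = N(m, s^2) and a Bernoulli (logistic)
# likelihood. All names here are illustrative, not from the paper.
import numpy as np

rng = np.random.default_rng(0)

def log_sigmoid(z):
    # Numerically stable log of the logistic sigmoid.
    return -np.logaddexp(0.0, -z)

def vi_data_term(y, m, s, n_samples=10_000):
    # Standard VI data term: E_q[log p(y | f)], estimated by Monte Carlo.
    f = m + s * rng.standard_normal(n_samples)
    return np.mean(log_sigmoid(y * f))

def direct_loss_data_term(y, m, s, n_samples=10_000):
    # Direct-loss data term: log loss of the Bayesian predictive,
    # -log E_q[p(y | f)], via a stable log-mean-exp over samples.
    f = m + s * rng.standard_normal(n_samples)
    log_lik = log_sigmoid(y * f)
    return -(np.logaddexp.reduce(log_lik) - np.log(n_samples))

y, m, s = 1.0, 0.5, 2.0  # toy label and pseudo-posterior parameters
print("VI data term      E_q[log p(y|f)] ~", vi_data_term(y, m, s))
print("Direct-loss term -log E_q[p(y|f)] ~", direct_loss_data_term(y, m, s))
# By Jensen's inequality, -E_q[log p(y|f)] >= -log E_q[p(y|f)], so the two
# objectives differ whenever q has non-negligible variance.
```

In a full training objective, either data term would be summed over observations and combined with a regularizer on q; the point of the sketch is only that the direct-loss criterion scores the pseudo-posterior by the loss of its actual predictions rather than by a lower bound on the marginal likelihood.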