Keywords: Gibbs sampling, MCMC methods, quantum sampling, non-convex optimization, quantum gradient estimation, LMC, HMC, quantum algorithm
TL;DR: We propose quantum algorithms to reduce the evaluation query complexities of HMC and LMC for sampling and optimization.
Abstract: We propose quantum algorithms with provable speedups for sampling from probability distributions of the form $\pi \propto e^{-f}$, where $f:\mathbb{R}^d\to \mathbb{R}$ is a potential function. In particular, we assume access only to a stochastic evaluation oracle, which allows simultaneous queries of the potential value at two different points under the same stochastic parameter. By introducing novel quantum algorithms for stochastic gradient estimation in this setting, our algorithms improve the evaluation complexities of classical samplers, such as Hamiltonian Monte Carlo (HMC) and Langevin Monte Carlo (LMC), in terms of dimension, precision, and other problem-dependent parameters. Furthermore, we demonstrate that our quantum sampling algorithms can be used to achieve quantum speedups in optimization, particularly for minimizing nonsmooth and approximately convex functions that commonly appear in empirical risk minimization problems.
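To make the classical baseline concrete, the following is a minimal sketch of the Langevin Monte Carlo (LMC) iteration that the abstract refers to: $x_{k+1} = x_k - \eta \nabla f(x_k) + \sqrt{2\eta}\,\xi_k$ with $\xi_k \sim \mathcal{N}(0, I)$, targeting $\pi \propto e^{-f}$. The function name `lmc_sample` and all parameter choices here are illustrative assumptions, not part of the submission; the paper's algorithms replace the exact gradient below with quantum estimates from a stochastic evaluation oracle.

```python
import numpy as np

def lmc_sample(grad_f, x0, eta=0.01, n_steps=50_000, rng=None):
    """Unadjusted Langevin Monte Carlo (illustrative sketch).

    Iterates x_{k+1} = x_k - eta * grad_f(x_k) + sqrt(2*eta) * xi_k,
    where xi_k is standard Gaussian noise, to sample from pi ∝ e^{-f}.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    x = np.asarray(x0, dtype=float)
    samples = np.empty((n_steps, x.size))
    for k in range(n_steps):
        x = x - eta * grad_f(x) + np.sqrt(2 * eta) * rng.standard_normal(x.size)
        samples[k] = x
    return samples

# Example target: f(x) = ||x||^2 / 2, so pi is the standard Gaussian;
# grad_f(x) = x.  Discard an initial burn-in before using the samples.
samples = lmc_sample(lambda x: x, x0=np.zeros(2))
post_burn_in = samples[10_000:]
```

With small step size $\eta$, the empirical mean and variance of `post_burn_in` approximate those of the standard Gaussian target, up to the usual $O(\eta)$ discretization bias of the unadjusted chain.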
Primary Area: probabilistic methods (Bayesian methods, variational inference, sampling, UQ, etc.)
Submission Number: 22835