Faster Sampling from Gibbs Distributions with Quantum Variance Reduction

20 Sept 2025 (modified: 11 Feb 2026) · Submitted to ICLR 2026 · CC BY 4.0
Keywords: Gibbs sampling, MCMC methods, quantum sampling, quantum mean estimation, LMC, HMC, quantum algorithm
TL;DR: We propose quantum algorithms to reduce the gradient query complexities of HMC and LMC for sampling from convex and non-convex potentials.
Abstract: We present quantum algorithms that provide provable speedups for approximate sampling from probability distributions of the form $\pi \propto e^{-f}$, where the potential $f$ can be written as a finite sum, i.e., $f = \frac{1}{n}\sum_{i=1}^n f_i$. Our approach focuses on stochastic gradient–based methods with only oracle access to the individual gradients $\{\nabla f_i\}_{i\in [n]}$. Our quantum algorithms rest on a non-trivial integration of quantum mean estimation with existing variance reduction techniques such as SVRG and control variates (CV). Because these techniques require occasional full-gradient computations, the key challenge is that an unbalanced weighting between variance reduction and quantum mean estimation leads to a regime where frequent full-gradient computation erases the quantum advantage. We overcome this difficulty by carefully optimizing the target variance level. Our algorithms improve the gradient query complexity of classical samplers such as Hamiltonian Monte Carlo (HMC) and Langevin Monte Carlo (LMC) in terms of dimension, precision, and other problem-dependent parameters.
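To make the finite-sum setup concrete, here is a minimal classical sketch of the SVRG-style variance-reduced Langevin Monte Carlo baseline that the abstract's quantum algorithms build on. This is not the paper's quantum method (quantum mean estimation cannot be simulated this way); the function name `svrg_lmc`, the epoch length, and the step size are illustrative assumptions. It shows the trade-off mentioned above: a full gradient is recomputed only once per epoch, and in between a single random component acts as a control-variate correction.

```python
import numpy as np

def svrg_lmc(grads, x0, step, n_steps, epoch_len, rng=None):
    """Classical SVRG-flavored Langevin Monte Carlo (illustrative sketch).

    grads: per-component gradient oracles [grad f_1, ..., grad f_n];
    the potential is f = (1/n) * sum_i f_i and the target density is
    proportional to exp(-f). Every `epoch_len` steps a full gradient is
    recomputed at an anchor point; in between, one randomly chosen
    component corrects it, reducing the stochastic-gradient variance.
    """
    rng = np.random.default_rng(rng)
    n = len(grads)
    x = np.asarray(x0, dtype=float)
    samples = []
    for t in range(n_steps):
        if t % epoch_len == 0:                # occasional full-gradient pass
            anchor = x.copy()
            full_grad = sum(g(anchor) for g in grads) / n
        i = rng.integers(n)                   # sample one component index
        # SVRG estimator: grad f_i(x) - grad f_i(anchor) + full gradient
        g = grads[i](x) - grads[i](anchor) + full_grad
        noise = rng.standard_normal(x.shape)
        x = x - step * g + np.sqrt(2.0 * step) * noise  # Langevin update
        samples.append(x.copy())
    return np.array(samples)
```

For a quick sanity check, taking $f_i(x) = \tfrac{1}{2}(x - a_i)^2$ makes the target a standard Gaussian centered at the mean of the $a_i$, and the chain's empirical mean and variance should match after burn-in.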
Primary Area: probabilistic methods (Bayesian methods, variational inference, sampling, UQ, etc.)
Submission Number: 22823