Mean-Square Analysis with An Application to Optimal Dimension Dependence of Langevin Monte Carlo

21 May 2021 (modified: 05 May 2023) · NeurIPS 2021 Submission · Readers: Everyone
Keywords: mean-square analysis, SDE-based sampling algorithm, Langevin Monte Carlo, dimension dependence, non-asymptotic error analysis
TL;DR: This paper proposes a framework for the error analysis of SDE-based sampling algorithms and obtains optimal dimension dependence when applied to Langevin Monte Carlo.
Abstract: Sampling algorithms based on discretizations of Stochastic Differential Equations (SDEs) constitute a rich and popular subset of MCMC methods. This work provides a general framework for the non-asymptotic analysis of sampling error in the 2-Wasserstein distance, which also leads to a bound on the mixing time. The method applies to any consistent discretization of contractive SDEs. When applied to the Langevin Monte Carlo algorithm, it establishes an $\widetilde{\mathcal{O}}\left(\frac{\sqrt{d}}{\epsilon}\right)$ mixing time, without warm start, under the common log-smoothness and log-strong-convexity conditions, plus a growth condition on the potential of the target measure at infinity. This bound improves on the previously best known $\widetilde{\mathcal{O}}\left(\frac{d}{\epsilon}\right)$ result and is optimal in both the dimension $d$ and the accuracy tolerance $\epsilon$ for log-smooth and log-strongly-convex target measures. Our theoretical analysis is further validated by numerical experiments.
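For reference, Langevin Monte Carlo is the Euler–Maruyama discretization of the overdamped Langevin SDE; a standard formulation (notation assumed here, not taken from the paper) for a target measure $\pi \propto e^{-f}$ with step size $h$ is
$$x_{k+1} = x_k - h\,\nabla f(x_k) + \sqrt{2h}\,\xi_k, \qquad \xi_k \sim \mathcal{N}(0, I_d).$$
The mixing-time bound above quantifies how many such iterations suffice, as a function of $d$ and $\epsilon$, for the law of $x_k$ to come within $\epsilon$ of $\pi$ in 2-Wasserstein distance.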
Code Of Conduct: I certify that all co-authors of this work have read and commit to adhering to the NeurIPS Statement on Ethics, Fairness, Inclusivity, and Code of Conduct.
Supplementary Material: zip
