TL;DR: A gradient-based optimizer with a Gaussian-process derivative estimator achieves state-of-the-art performance in variational quantum eigensolvers.
Abstract: Parameter shift rules (PSRs) are key techniques for efficient gradient estimation in variational quantum eigensolvers (VQEs). In this paper, we propose a Bayesian variant of the PSR, in which Gaussian processes with appropriate kernels are used to estimate the gradient of the VQE objective. Our Bayesian PSR offers flexible gradient estimation from observations at arbitrary locations, provides uncertainty information, and reduces to the generalized PSR in special cases. In stochastic gradient descent (SGD), the flexibility of the Bayesian PSR allows observations from previous steps to be reused, which accelerates the optimization process. Furthermore, access to the posterior uncertainty, along with our proposed notion of the gradient confident region (GradCoRe), enables us to minimize the observation cost in each SGD step. Our numerical experiments show that VQE optimization with the Bayesian PSR and GradCoRe significantly accelerates SGD and outperforms state-of-the-art methods, including sequential minimal optimization.
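The core idea of estimating a gradient from a Gaussian process posterior can be sketched as follows. This is an illustrative toy example, not the paper's implementation: the kernel choice (constant plus cosine) reflects the fact that a VQE objective is sinusoidal in each gate parameter under the standard PSR assumptions, and all names are placeholders.

```python
import numpy as np

def kernel(a, b):
    # k(a, b) = 1 + cos(a - b): a constant term plus a periodic term, whose
    # RKHS contains functions of the form c0 + c1*cos(t) + c2*sin(t)
    return 1.0 + np.cos(a[:, None] - b[None, :])

def dkernel_dstar(star, b):
    # derivative of k(star, b) with respect to the query point star
    return -np.sin(star[:, None] - b[None, :])

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 2.0 * np.pi, 8)                    # arbitrary observation angles
f = lambda t: 0.3 + 0.8 * np.cos(t) - 0.5 * np.sin(t)   # toy sinusoidal objective
y = f(X) + 0.01 * rng.standard_normal(8)                # shot-noise-like observations

noise_var = 1e-4
K = kernel(X, X) + noise_var * np.eye(len(X))
alpha = np.linalg.solve(K, y)

theta_star = np.array([1.0])
grad_mean = dkernel_dstar(theta_star, X) @ alpha        # posterior-mean gradient
true_grad = -0.8 * np.sin(1.0) - 0.5 * np.cos(1.0)      # analytic derivative of f
print(grad_mean[0], true_grad)
```

Because differentiation is a linear operator, the posterior mean of the gradient is obtained by differentiating the kernel with respect to the query point; the same machinery yields a posterior variance, which is the quantity a confidence-region criterion such as GradCoRe would act on.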
Primary Area: Probabilistic Methods->Gaussian Processes
Keywords: parameter shift rule, variational quantum eigensolver, quantum computing, confidence region, Gaussian process
Submission Number: 7387