Convergence of stochastic gradient descent on parameterized sphere with applications to variational Monte Carlo simulation

Published: 01 Jan 2023, Last Modified: 15 May 2023, CoRR 2023
Abstract: We analyze stochastic gradient descent (SGD)-type algorithms on a high-dimensional sphere that is parameterized by a neural network up to a normalization constant. We provide a new algorithm for the supervised learning setting and show its convergence both theoretically and numerically. We also provide the first proof of convergence for the unsupervised setting, which corresponds to the widely used variational Monte Carlo (VMC) method in quantum physics.
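As a rough illustration of the unsupervised (VMC) setting mentioned in the abstract, the sketch below runs plain SGD on the variational energy of a 1D harmonic oscillator with a one-parameter Gaussian ansatz. This is a standard textbook VMC loop under illustrative assumptions of our own (the Hamiltonian, ansatz, Metropolis sampler, learning rate, and sample sizes are all chosen for the example), not the algorithm analyzed in the paper; it only shows how the normalized state lives on a sphere while the ansatz itself is unnormalized.

```python
# Minimal, self-contained VMC sketch (illustrative; not the paper's algorithm).
# Trial wavefunction psi_a(x) = exp(-a*x^2/2) for H = -0.5 d^2/dx^2 + 0.5 x^2;
# the normalized state |psi_a>/||psi_a|| lies on a sphere, but we only ever
# work with the unnormalized log-amplitude, as in standard VMC.
import numpy as np

rng = np.random.default_rng(0)

def log_psi(a, x):
    # Unnormalized log-amplitude of the trial wavefunction.
    return -0.5 * a * x**2

def local_energy(a, x):
    # E_loc(x) = (H psi_a)(x) / psi_a(x) for the harmonic oscillator above.
    return 0.5 * a + 0.5 * (1.0 - a**2) * x**2

def metropolis_samples(a, n_samples, step=1.0, n_burn=200):
    # Sample x ~ |psi_a|^2 with a simple random-walk Metropolis chain.
    x, samples = 0.0, []
    for i in range(n_burn + n_samples):
        x_new = x + step * rng.normal()
        if rng.random() < np.exp(2.0 * (log_psi(a, x_new) - log_psi(a, x))):
            x = x_new
        if i >= n_burn:
            samples.append(x)
    return np.array(samples)

# SGD on the variational energy E(a) = <psi_a|H|psi_a> / <psi_a|psi_a>.
a, lr = 0.4, 0.1
for it in range(200):
    xs = metropolis_samples(a, n_samples=500)
    e_loc = local_energy(a, xs)
    dlogpsi = -0.5 * xs**2            # d log psi_a / d a
    # Standard VMC gradient estimator: 2 Cov(E_loc, d log psi / d a).
    grad = 2.0 * np.mean((e_loc - e_loc.mean()) * dlogpsi)
    a -= lr * grad
print(f"a ~ {a:.3f} (exact ground state has a = 1), E ~ {e_loc.mean():.3f}")
```

With these choices the parameter converges toward a = 1 and the estimated energy toward the exact ground-state value 0.5; the paper's contribution concerns the convergence analysis of such SGD iterations, not this particular toy model.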