Mean-Field Langevin Dynamics: Exponential Convergence and Annealing

Published: 11 Aug 2022, Last Modified: 28 Feb 2023. Accepted by TMLR.
Abstract: Noisy particle gradient descent (NPGD) is an algorithm to minimize, over the space of measures, convex functions that include an entropy term. In the many-particle limit, this algorithm is described by a Mean-Field Langevin dynamics (a generalization of the Langevin dynamics with a non-linear drift), which is our main object of study. Previous work has shown its convergence to the unique minimizer via non-quantitative arguments. We prove that this dynamics converges at an exponential rate, under the assumption that a certain family of log-Sobolev inequalities holds. This assumption holds, for instance, for the minimization of the risk of certain two-layer neural networks, where NPGD is equivalent to standard noisy gradient descent. We also study the annealed dynamics and show that, for a noise level decaying at a logarithmic rate, the dynamics converges in value to the global minimizer of the unregularized objective function.
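
For concreteness, the mean-field Langevin dynamics described in the abstract can be written as the following SDE (notation is ours, chosen to match the standard setting, not quoted from the paper):

```latex
% Mean-field Langevin dynamics (notation ours): F is the convex objective over
% measures, \frac{\delta F}{\delta \mu} its first variation, \lambda > 0 the
% entropy weight, and (B_t) a standard Brownian motion. The drift is non-linear
% in the sense that it depends on \mu_t, the law of X_t itself.
\mathrm{d}X_t = -\nabla \frac{\delta F}{\delta \mu}(\mu_t)(X_t)\,\mathrm{d}t
  + \sqrt{2\lambda}\,\mathrm{d}B_t,
\qquad \mu_t = \mathrm{Law}(X_t).
```

Below is a minimal sketch of NPGD with an optional annealing schedule, assuming the gradient of the objective's first variation is available as a function of the particle system. All names (`grad_V`, `step`, `lam`, `anneal`) are illustrative and not taken from the paper's code; see the repository linked below for the authors' implementation.

```python
import numpy as np

def npgd(grad_V, x0, step=1e-3, lam=1e-2, n_iters=10_000, anneal=False, seed=0):
    """Illustrative sketch of noisy particle gradient descent (not the
    paper's code). x0 has shape (m, d): m particles in R^d.

    grad_V(x): maps the (m, d) particle array to the (m, d) array of
    gradients of the objective's first variation at the empirical measure;
    for two-layer networks this is the usual per-neuron risk gradient.
    """
    rng = np.random.default_rng(seed)
    x = x0.copy()
    for t in range(n_iters):
        # Annealed dynamics: noise level decaying at a logarithmic rate.
        lam_t = lam / np.log(np.e + t) if anneal else lam
        # Euler--Maruyama discretization of the mean-field Langevin SDE.
        noise = rng.standard_normal(x.shape)
        x = x - step * grad_V(x) + np.sqrt(2.0 * step * lam_t) * noise
    return x

# Toy usage: V(x) = |x|^2 / 2, so the particles approach a Gaussian law.
particles = npgd(lambda x: x, x0=np.random.default_rng(1).standard_normal((256, 2)))
```

With `anneal=True`, the noise level `lam_t` follows the logarithmic decay under which, per the abstract, the dynamics converges in value to the global minimizer of the unregularized objective.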
Submission Length: Regular submission (no more than 12 pages of main content)
Changes Since Last Submission:
- Corrected and clarified the proof of Thm. 4.1, following the reviewers' remarks.
- Added Section 5.2, which contains numerical experiments, following the reviewers' suggestion.
- Added references to previous quantitative rates in the large-noise regime and the displacement-convex case.
- Extended the discussion of the LSI assumption for two-layer neural networks.
Code: https://github.com/lchizat/2022-mean-field-langevin-rate
Assigned Action Editor: ~Murat_A_Erdogdu1
License: Creative Commons Attribution 4.0 International (CC BY 4.0)
Submission Number: 110