Fluctuation-dissipation relations for stochastic gradient descent

Sep 27, 2018 (edited Dec 21, 2018) · ICLR 2019 Conference Blind Submission
  • Abstract: The notion of the stationary equilibrium ensemble has played a central role in statistical mechanics. In machine learning as well, training serves as generalized equilibration that drives the probability distribution of model parameters toward stationarity. Here, we derive stationary fluctuation-dissipation relations that link measurable quantities and hyperparameters in the stochastic gradient descent algorithm. These relations hold exactly for any stationary state and can in particular be used to adaptively set the training schedule. We can further use the relations to efficiently extract information pertaining to a loss-function landscape, such as the magnitudes of its Hessian and anharmonicity. Our claims are empirically verified.
  • Keywords: stochastic gradient descent, adaptive method, loss surface, Hessian
  • TL;DR: We prove fluctuation-dissipation relations for SGD, which can be used to (i) adaptively set learning rates and (ii) probe loss surfaces.
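To make use (i) concrete, here is a minimal sketch, not the authors' exact algorithm: assume the first stationary relation takes the form ⟨θ·g⟩ = (η/2)⟨‖g‖²⟩, where g is the minibatch gradient and η the learning rate (this form follows directly from stationarity of ⟨θ·θ⟩ under the SGD update). Monitoring running averages of both sides gives a stationarity test that can trigger a learning-rate decay. The toy quadratic landscape, tolerance, and decay factor below are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 10
eta = 0.1                                  # learning rate
theta = rng.normal(size=d)                 # model parameters
lam = rng.uniform(0.5, 2.0, size=d)        # Hessian eigenvalues of a toy quadratic loss

beta = 0.9999                              # running-average decay
left_avg = right_avg = 0.0                 # running averages of theta.g and (eta/2)||g||^2
for t in range(1, 200001):
    g = lam * theta + 0.1 * rng.normal(size=d)          # noisy minibatch gradient
    left_avg = beta * left_avg + (1 - beta) * (theta @ g)
    right_avg = beta * right_avg + (1 - beta) * (0.5 * eta * (g @ g))
    theta -= eta * g                                    # plain SGD update
    if t % 50000 == 0:
        # Near stationarity the ratio approaches 1; then decay the learning rate.
        if abs(left_avg / right_avg - 1.0) < 0.25:
            eta *= 0.5
```

Because the relation holds exactly at stationarity, no quantity here requires access to the full-batch loss: both averages are built from the same minibatch gradients the optimizer already computes.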