Stochastic gradient algorithms from ODE splitting perspective

Published: 27 Feb 2020, Last Modified: 05 May 2023 · ICLR 2020 Workshop ODE/PDE+DL Poster
Keywords: SGD, Splitting, ODE
TL;DR: We treat SGD as a splitting scheme for the continuous gradient flow equation and show that a more accurate solution of each local problem leads to greater robustness in the choice of step size.
Abstract: We present a different view on stochastic optimization, which goes back to splitting schemes for the approximate solution of ODEs. In this work, we establish a connection between the stochastic gradient descent approach and first-order splitting schemes for ODEs. We consider a special case of splitting, inspired by machine learning applications, and derive a new upper bound on its global splitting error. We show that the Kaczmarz method is the limiting case of the splitting scheme for unit-batch SGD applied to the linear least-squares problem. We support our findings with systematic empirical studies, which demonstrate that a more accurate solution of the local problems leads to step-size robustness and better convergence, in both time and iterations, on the softmax regression problem.
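To make the splitting view and the Kaczmarz connection concrete, below is a minimal sketch (our illustration under stated assumptions, not code from the paper) for linear least squares f(x) = 1/2 ||Ax - b||^2. The full gradient flow dx/dt = -A^T (Ax - b) splits into per-sample flows dx/dt = -a_i (a_i^T x - b_i), each solvable in closed form: integrating a sub-flow for a small time h recovers one SGD step of size h, while integrating it exactly as h -> infinity gives the Kaczmarz projection onto the hyperplane a_i^T x = b_i. The names local_flow_step and splitting_epoch are hypothetical.

import numpy as np

def local_flow_step(x, a_i, b_i, h):
    # Exactly integrate the local gradient flow dx/dt = -a_i (a_i^T x - b_i)
    # for time h; this solves one sub-problem of the first-order splitting.
    norm2 = a_i @ a_i
    residual = b_i - a_i @ x
    decay = 1.0 - np.exp(-norm2 * h)  # -> 1 as h -> inf (Kaczmarz projection)
    return x + (decay * residual / norm2) * a_i

def splitting_epoch(x, A, b, h, rng):
    # One pass over the rows of A in random order; each local problem is
    # solved exactly, so only the splitting error itself remains.
    for i in rng.permutation(len(b)):
        x = local_flow_step(x, A[i], b[i], h)
    return x

# Example usage: rng = np.random.default_rng(0); x = splitting_epoch(x0, A, b, h=1.0, rng=rng)

For small h, 1 - exp(-||a_i||^2 h) ≈ ||a_i||^2 h, so the update reduces to the plain SGD step x - h a_i (a_i^T x - b_i); this is the sense in which exact local solutions interpolate between SGD and the Kaczmarz method.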