Preconditioner on Matrix Lie Group for SGD

Published: 21 Dec 2018 · Last Modified: 03 Apr 2024 · ICLR 2019 Conference Blind Submission
Abstract: We study two types of preconditioners and preconditioned stochastic gradient descent (SGD) methods in a unified framework. We call the first one the Newton type due to its close relationship to the Newton method, and the second one the Fisher type as its preconditioner is closely related to the inverse of the Fisher information matrix. Both preconditioners can be derived from one framework and efficiently estimated on any matrix Lie group designated by the user, using natural or relative gradient descent to minimize certain preconditioner estimation criteria. Many existing preconditioners and methods, e.g., RMSProp, Adam, KFAC, equilibrated SGD, and batch normalization, are special cases of, or closely related to, either the Newton-type or the Fisher-type preconditioners. Experimental results on relatively large-scale machine learning problems are reported to study performance.
Keywords: preconditioner, stochastic gradient descent, Newton method, Fisher information, natural gradient, Lie group
TL;DR: We propose a new framework for preconditioner learning, derive new forms of preconditioners and learning methods, and reveal the relationship to methods such as RMSProp, Adam, Adagrad, ESGD, KFAC, and batch normalization.
Code: [lixilinx/psgd_torch](https://github.com/lixilinx/psgd_torch) + [1 community implementation](https://paperswithcode.com/paper/?openreview=Bye5SiAqKX)
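
For intuition, here is a minimal toy sketch of the core idea: fit a preconditioner on a matrix Lie group by gradient descent on a fitting criterion, then use it to precondition SGD. It assumes the Newton-type criterion c(P) = E[δgᵀ P δg + δθᵀ P⁻¹ δθ] from the PSGD line of work, restricted to the simplest group, positive diagonal matrices; the toy quadratic, the step normalization, and all constants are illustrative choices, not the paper's implementation (see the linked repo for that).

```python
import numpy as np

# Toy quadratic f(theta) = 0.5 * theta' H theta with an ill-conditioned
# diagonal Hessian, so the ideal preconditioner is simply inv(H).
rng = np.random.default_rng(0)
h = 10.0 ** np.linspace(-3.0, 0.0, 10)   # Hessian eigenvalues, 1e-3 .. 1

def grad(theta):
    return h * theta                     # gradient of the quadratic

theta = rng.standard_normal(10)
log_p = np.zeros(10)    # P = diag(exp(log_p)): a point on the Lie group of
                        # positive diagonal matrices, updated multiplicatively
lr_theta, lr_p, eps = 0.5, 0.1, 1e-8

for _ in range(1000):
    g = grad(theta)

    # Probe curvature: a tiny random perturbation and the gradient change.
    dtheta = eps * rng.standard_normal(10)
    dg = grad(theta + dtheta) - g

    # Newton-type criterion, diagonal specialization (an assumption for this
    # toy): c(P) = E[dg' P dg + dtheta' inv(P) dtheta].  Its gradient w.r.t.
    # log_p is p * dg**2 - dtheta**2 / p, which vanishes at p = |dtheta / dg|,
    # so P approaches the inverse of the absolute Hessian diagonal.
    p = np.exp(log_p)
    dc = p * dg**2 - dtheta**2 / p
    log_p -= lr_p * dc / (np.abs(dc).max() + 1e-30)   # normalized group step

    theta -= lr_theta * np.exp(log_p) * g             # preconditioned SGD step

print("fitted p  :", np.round(np.exp(log_p)[:4], 1))
print("ideal 1/h :", np.round(1.0 / h[:4], 1))
```

Richer matrix Lie groups follow the same scheme, with the relative (natural) gradient of the chosen group in place of the multiplicative diagonal update. For the Fisher type, one would roughly replace the pair (δθ, δg) with (v, ĝ + λv), v being standard normal noise independent of the stochastic gradient ĝ, so that in this diagonal toy the fitted p tends to (E[ĝ²] + λ²)^(-1/2), an RMSProp/Adam-style scaling; this is one way to read the abstract's claim that those methods arise as special cases.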