Kronecker-factored Curvature Approximations for Recurrent Neural Networks

15 Feb 2018 (edited 10 Feb 2022) · ICLR 2018 Conference Blind Submission
  • Keywords: optimization, K-FAC, natural gradient, recurrent neural networks
  • TL;DR: We extend the K-FAC method to RNNs by developing a new family of Fisher approximations.
  • Abstract: Kronecker-factored Approximate Curvature (K-FAC; Martens & Grosse, 2015) is a 2nd-order optimization method which has been shown to give state-of-the-art performance on large-scale neural network optimization tasks (Ba et al., 2017). It is based on an approximation to the Fisher information matrix (FIM) that makes assumptions about the particular structure of the network and the way it is parameterized. The original K-FAC method was applicable only to fully-connected networks, although it has since been extended by Grosse & Martens (2016) to handle convolutional networks as well. In this work we extend the method to handle RNNs by introducing a novel approximation to the FIM for RNNs. This approximation works by modelling the covariance structure between the gradient contributions at different time-steps using a chain-structured linear Gaussian graphical model, summing the various cross-covariances, and computing the inverse in closed form. We demonstrate in experiments that our method significantly outperforms general-purpose state-of-the-art optimizers like SGD with momentum and Adam on several challenging RNN training tasks.
  • Data: [Penn Treebank](https://paperswithcode.com/dataset/penn-treebank)
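To give a concrete sense of the Kronecker-factored machinery the paper builds on, here is a minimal NumPy sketch of the simplest variant for a recurrent weight matrix: treating the per-time-step gradient contributions as independent, averaging their second moments, and exploiting the Kronecker structure to invert the approximate Fisher block cheaply. The paper's actual contribution replaces this independence assumption with a chain-structured linear Gaussian model of the cross-time-step covariances; all names, dimensions, and the damping value below are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions: a[t] are the layer inputs (activations) and g[t] the
# back-propagated output gradients at each of T time-steps.
T, n_in, n_out = 5, 4, 3
a = rng.standard_normal((T, n_in))
g = rng.standard_normal((T, n_out))

# Simplest K-FAC-style approximation for the recurrent weights:
# average the second-moment matrices over time-steps and approximate
# the Fisher block as F ≈ A ⊗ G (time-steps assumed independent).
A = sum(np.outer(a[t], a[t]) for t in range(T)) / T
G = sum(np.outer(g[t], g[t]) for t in range(T)) / T

# Kronecker structure makes inversion cheap via the identity
# (A ⊗ G)^{-1} = A^{-1} ⊗ G^{-1}: only the small factors are inverted.
# Tikhonov damping is added per factor, as is common in K-FAC practice.
damping = 1e-2
A_inv = np.linalg.inv(A + damping * np.eye(n_in))
G_inv = np.linalg.inv(G + damping * np.eye(n_out))
F_inv = np.kron(A_inv, G_inv)

# Precondition a flattened weight gradient with the approximate inverse
# to obtain an (approximate) natural-gradient step.
grad_W = rng.standard_normal(n_in * n_out)
nat_grad = F_inv @ grad_W
print(nat_grad.shape)  # (12,)
```

The key design point is that inverting the two small factors (here 4×4 and 3×3) stands in for inverting the full 12×12 Fisher block; the paper's RNN-specific approximations keep this property while also accounting for correlations between time-steps.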
