Exponentially Decaying Flows for Optimization in Deep Learning

27 Sep 2018 (modified: 18 Dec 2018) · ICLR 2019 Conference Withdrawn Submission
  • Abstract: The field of deep learning has long sought an optimization method that performs well in terms of both optimization and generalization. We propose a method for mathematical optimization based on flows along geodesics, that is, the shortest paths between two points, with respect to the Riemannian metric induced by a non-linear function. We call these flows Exponentially Decaying Flows (EDF), as they can be designed to converge to local solutions exponentially. In this paper, we conduct experiments demonstrating strong performance on optimization benchmarks (i.e., convergence properties), as well as the method's potential for strong results on machine learning benchmarks (i.e., generalization properties).
  • Keywords: optimization, deep learning
  • TL;DR: Introduction of a new optimization method and its application to deep learning.
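A minimal sketch of the kind of dynamics the abstract describes, under an assumption not stated in the source: one natural flow whose gradient decays exponentially is the continuous Newton flow dx/dt = -H(x)^{-1} ∇f(x), since along it d/dt ∇f(x(t)) = -∇f(x(t)), so ‖∇f‖ shrinks like exp(-t). This is an illustrative reading, not the authors' implementation; the toy objective and step size are chosen for demonstration only.

```python
import numpy as np

# Hypothetical illustration (not the paper's code): integrate the continuous
# Newton flow dx/dt = -H(x)^{-1} grad f(x) with forward Euler on a toy
# quadratic f(x) = 0.5 * x^T A x, whose gradient is A x and Hessian is A.
# Along the exact flow the gradient decays as exp(-t) * grad f(x(0)).

A = np.array([[3.0, 1.0], [1.0, 2.0]])  # symmetric positive-definite Hessian

def grad(x):
    return A @ x

def hess(x):
    return A

x = np.array([2.0, -1.5])  # arbitrary starting point
dt = 0.1                   # Euler step size along the flow
for _ in range(200):
    # One Euler step: x <- x - dt * H^{-1} g
    x = x - dt * np.linalg.solve(hess(x), grad(x))

print(np.linalg.norm(grad(x)))  # gradient norm is driven toward zero
```

On this quadratic the update reduces to x ← (1 - dt) x, so the iterates (and the gradient norm) contract geometrically, the discrete analogue of the exponential decay the paper's name refers to.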