A Smooth Optimisation Perspective on Training Feedforward Neural Networks

03 May 2025 (modified: 22 Feb 2017) · ICLR 2017 · Readers: Everyone
Abstract: We present a smooth optimisation perspective on training multilayer feedforward neural networks (FNNs) in the supervised learning setting. By characterising the critical point conditions of an FNN-based optimisation problem, we identify conditions under which local optima of the cost function are eliminated. By studying the Hessian structure of the cost function at the global minima, we develop an approximate Newton algorithm for FNNs, which demonstrates promising convergence properties.
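The abstract does not spell out the paper's specific approximate Newton update. As a hedged illustration only, the sketch below shows a generic damped Gauss-Newton step (a standard approximate Newton method that replaces the Hessian with J^T J) for a tiny one-hidden-layer network with squared loss; the network sizes, data, damping constant, and backtracking rule are all illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

# Illustrative assumption: a tiny one-hidden-layer tanh network trained on
# synthetic data; this is NOT the paper's exact algorithm, just a generic
# damped Gauss-Newton (approximate Newton) sketch.
rng = np.random.default_rng(0)
X = rng.standard_normal((32, 3))                 # 32 samples, 3 inputs
y = np.sin(X @ np.array([1.0, -2.0, 0.5]))       # synthetic targets

N_IN, N_HID = 3, 4

def unpack(theta):
    W = theta[:N_IN * N_HID].reshape(N_IN, N_HID)
    v = theta[N_IN * N_HID:]
    return W, v

def residuals(theta):
    W, v = unpack(theta)
    return np.tanh(X @ W) @ v - y                # r(theta), shape (32,)

def jacobian(theta, eps=1e-6):
    # Finite-difference Jacobian of the residual vector (for clarity only).
    r0 = residuals(theta)
    J = np.empty((r0.size, theta.size))
    for i in range(theta.size):
        t = theta.copy()
        t[i] += eps
        J[:, i] = (residuals(t) - r0) / eps
    return J

theta = 0.1 * rng.standard_normal(N_IN * N_HID + N_HID)
for _ in range(50):
    r, J = residuals(theta), jacobian(theta)
    # Gauss-Newton: approximate the Hessian by J^T J (second-order terms
    # dropped), with a small damping term for numerical stability.
    step = np.linalg.solve(J.T @ J + 1e-3 * np.eye(theta.size), J.T @ r)
    # Simple backtracking so the training loss never increases.
    t = 1.0
    while (np.mean(residuals(theta - t * step) ** 2) > np.mean(r ** 2)
           and t > 1e-6):
        t *= 0.5
    theta -= t * step

mse = float(np.mean(residuals(theta) ** 2))
print(mse)  # final training MSE, reduced from roughly mean(y**2)
```

The J^T J approximation is exact at a zero-residual global minimum, which is why Gauss-Newton-type methods are a natural fit for the Hessian structure the abstract refers to.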
Conflicts: tum.de
Keywords: Theory, Supervised Learning, Optimization
5 Replies
