Newton Method Revisited: Global Convergence Rates up to $O(1/k^3)$ for Stepsize Schedules and Linesearch Procedures
Keywords: Damped Newton Methods, Tensor Methods, Linesearch Procedures, Global Convergence Guarantees
TL;DR: We analyze the stepsized Newton method under various Hölder continuity assumptions, including Hölder continuity of third derivatives. We present the first stepsize schedule with an $\mathcal O(k^{-3})$ global convergence rate.
Abstract: This paper investigates the global convergence of stepsized Newton methods for convex functions with Hölder continuous Hessians or third derivatives. We propose several simple stepsize schedules with fast global convergence guarantees, up to $\mathcal O( k^{-3} )$. For cases with multiple plausible smoothness parameterizations or an unknown smoothness constant, we introduce a stepsize linesearch and a backtracking procedure with provable convergence as if the optimal smoothness parameters were known in advance.
Additionally, we present strong convergence guarantees for the Newton method with exact linesearch, which is popular in practice.
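To make the setting concrete, below is a minimal sketch of a damped (stepsized) Newton method with a backtracking linesearch on the stepsize. The specific schedules, smoothness-adaptive rules, and constants analyzed in the paper are not reproduced; the Armijo-style backtracking and the test function are illustrative assumptions only.

```python
import numpy as np

def damped_newton(f, grad, hess, x0, max_iters=50, tol=1e-10,
                  beta=0.5, c=1e-4):
    """Generic damped Newton sketch: Newton direction plus backtracking.

    Illustrative only; not the stepsize schedules from the paper.
    """
    x = x0.astype(float)
    for _ in range(max_iters):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        # Newton direction: solve H(x) d = -g(x)
        d = np.linalg.solve(hess(x), -g)
        # Backtracking: shrink stepsize h until an Armijo condition holds
        h, fx = 1.0, f(x)
        while f(x + h * d) > fx + c * h * (g @ d):
            h *= beta
        x = x + h * d
    return x

# Example: a smooth strictly convex function with minimizer at the origin
f = lambda x: np.sum(np.cosh(x))       # Hessian = diag(cosh(x)), positive definite
grad = lambda x: np.sinh(x)
hess = lambda x: np.diag(np.cosh(x))
x_star = damped_newton(f, grad, hess, np.array([2.0, -1.5]))
```

The damping (stepsize $h \le 1$) is what makes the global phase of the method analyzable; near the solution the full Newton step $h = 1$ is accepted and fast local convergence takes over.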
Primary Area: optimization
Submission Number: 2053