Keywords: Quasi-Newton, Nonlinear acceleration, Accelerated, Convergence, Rate, Adaptive, Non-asymptotic
TL;DR: The paper proposes a generic method with global accelerated convergence rates that has strong links to quasi-Newton methods, nonlinear acceleration algorithms, and finite difference schemes.
Abstract: Despite the impressive numerical performance of quasi-Newton and Anderson/nonlinear acceleration methods, their global convergence rates have remained elusive for over 50 years. This paper addresses this long-standing question by introducing a framework that derives novel and adaptive quasi-Newton or nonlinear/Anderson acceleration schemes. Under mild assumptions, the proposed iterative methods exhibit explicit, non-asymptotic convergence rates that blend those of gradient descent and Cubic Regularized Newton's method. Notably, these rates are achieved adaptively, as the method autonomously determines the optimal step size using a simple backtracking strategy. The proposed approach also includes an accelerated version that improves the convergence rate on convex functions. Numerical experiments demonstrate the efficiency of the proposed framework, even compared to a fine-tuned BFGS algorithm with line search.
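The adaptive step-size selection described in the abstract relies on a simple backtracking strategy. The sketch below is only a minimal illustration of a generic Armijo-style backtracking gradient step, not the paper's actual scheme; the function names, sufficient-decrease constant `c`, and shrink factor `beta` are assumptions chosen for illustration.

```python
# Illustrative sketch only: a generic backtracking step-size search for a
# gradient step. This is NOT the proposed quasi-Newton/Anderson scheme; all
# names and constants here are assumptions for demonstration purposes.
import numpy as np

def backtracking_gradient_step(f, grad_f, x, t0=1.0, beta=0.5, c=1e-4, max_tries=50):
    """Return x - t * grad_f(x), with t chosen by Armijo backtracking."""
    g = grad_f(x)
    fx = f(x)
    t = t0
    for _ in range(max_tries):
        x_new = x - t * g
        # Armijo sufficient-decrease condition
        if f(x_new) <= fx - c * t * np.dot(g, g):
            return x_new, t
        t *= beta  # shrink the step size and retry
    return x - t * g, t

# Example usage on a simple quadratic
if __name__ == "__main__":
    f = lambda x: 0.5 * np.dot(x, x)
    grad_f = lambda x: x
    x = np.array([3.0, -4.0])
    for _ in range(20):
        x, t = backtracking_gradient_step(f, grad_f, x)
    print(x)  # approaches the minimizer at the origin
```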
Supplementary Material: pdf
Submission Number: 2910