Almost multisecant BFGS quasi-Newton method

Published: 26 Oct 2023, Last Modified: 13 Dec 2023
NeurIPS 2023 Workshop Poster
Keywords: quasi Newton, BFGS, multisecant, convex optimization
Abstract: Quasi-Newton (QN) methods provide an alternative to second-order techniques for solving minimization problems by approximating curvature. This approach reduces computational complexity because it relies solely on first-order information while satisfying the secant condition. This paper focuses on multi-secant (MS) extensions of QN methods for convex optimization problems, which enhance the Hessian approximation at low cost. Specifically, we use a low-rank perturbation strategy to construct an almost-multisecant QN method that maintains positive definiteness of the Hessian estimate, which in turn helps ensure a descent direction and reduces the risk of divergence. Our results show that careful tuning of the updates greatly improves the stability and effectiveness of multisecant updates.
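To make the multisecant idea concrete, the following is a minimal NumPy sketch, not the paper's algorithm: it builds a least-squares Hessian estimate satisfying the multisecant condition B S ≈ Y from step columns S and gradient-difference columns Y, then symmetrizes and eigenvalue-clips the result so it stays positive definite (one simple way to realize the "perturb to restore positive definiteness" idea; the function name, the clipping threshold `eps`, and the pseudoinverse-based fit are all illustrative assumptions).

```python
import numpy as np

def multisecant_hessian_estimate(S, Y, eps=1e-6):
    """Illustrative sketch (not the paper's method): least-squares Hessian
    estimate B with B @ S ~= Y (the multisecant condition), symmetrized and
    eigenvalue-clipped so the estimate remains positive definite.

    S, Y : (n, k) arrays whose columns are steps s_i and gradient
           differences y_i, respectively.
    """
    B = Y @ np.linalg.pinv(S)      # least-squares fit of B S = Y
    B = 0.5 * (B + B.T)            # symmetrize
    w, V = np.linalg.eigh(B)
    w = np.clip(w, eps, None)      # clip eigenvalues below eps: a simple
                                   # perturbation restoring positive definiteness
    return (V * w) @ V.T

# Toy check on a quadratic f(x) = 0.5 x^T A x, where y_i = A s_i exactly,
# so with n linearly independent secant pairs the estimate recovers A.
rng = np.random.default_rng(0)
A = np.array([[3.0, 1.0], [1.0, 2.0]])   # SPD curvature matrix
S = rng.standard_normal((2, 2))          # two step directions
Y = A @ S                                # matching gradient differences
B = multisecant_hessian_estimate(S, Y)
print(np.allclose(B, A))                 # exact recovery on a quadratic
```

On a quadratic with a full set of independent secant pairs the multisecant fit recovers the true Hessian; in the general nonquadratic case the clipping step is what keeps the estimate usable as a preconditioner that yields descent directions.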
Submission Number: 86