Revisiting High-Resolution ODEs for Faster Convergence Rates

16 Sept 2023 (modified: 11 Feb 2024) · Submitted to ICLR 2024
Primary Area: optimization
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: Convex optimization, first-order method, ordinary differential equation, accelerated method, Lyapunov function, convergence rate, semi-implicit Euler, high-resolution ODE, gradient minimization
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
TL;DR: We show that existing high-resolution ODEs can be recovered from a general ODE whose discretization reduces exactly to first-order accelerated methods, and we use this general model to prove faster convergence rates than prior Lyapunov-based results for the recovered ODEs and algorithms.
Abstract: There has been a growing interest in high-resolution ordinary differential equations (HR-ODEs) for investigating the dynamics and convergence characteristics of momentum-based optimization algorithms. As a result, the literature includes a number of HR-ODEs that represent diverse methods. In this work, we demonstrate that these different HR-ODEs can be unified as special cases of a general HR-ODE model with varying parameters. In addition, using integral quadratic constraints from robust control theory, we introduce a general Lyapunov function for the convergence analysis of the proposed HR-ODE. Not only can a large number of popular optimization algorithms be viewed as discretizations of our general HR-ODE, but our analysis also leads to several critical improvements in the convergence guarantees of these methods, in both continuous-time and discrete-time settings. Notable improvements include enhanced convergence guarantees, compared to prior art, for the triple momentum method ODE in continuous time and for the quasi-hyperbolic momentum algorithm in discrete time.
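As a toy illustration of the idea in the abstract and keywords (momentum methods as discretizations of an ODE via a semi-implicit Euler scheme), consider the classical heavy-ball ODE x'' + a x' + ∇f(x) = 0. The sketch below is not the paper's general HR-ODE; the step size, damping constant, and quadratic objective are hypothetical choices chosen only to show how a semi-implicit discretization yields a momentum-style first-order method.

```python
# Minimal sketch (assumptions: toy quadratic f(x) = 0.5 * mu * x^2,
# hypothetical step size h and damping a; NOT the paper's general HR-ODE).
# Semi-implicit Euler on the heavy-ball ODE  x'' + a x' + grad f(x) = 0:
# update the velocity with the old position, then update the position
# with the *new* velocity, which recovers a momentum-style iteration.

def grad(x, mu=1.0):
    """Gradient of the toy quadratic f(x) = 0.5 * mu * x**2."""
    return mu * x

def semi_implicit_momentum(x0, v0, h=0.1, a=1.0, steps=200):
    x, v = x0, v0
    for _ in range(steps):
        v = v - h * (a * v + grad(x))  # velocity step (explicit in x)
        x = x + h * v                  # position step uses the new v
    return x

x_final = semi_implicit_momentum(x0=5.0, v0=0.0)
print(abs(x_final))  # small: the iterates approach the minimizer x* = 0
```

Discretizing with explicit Euler instead (updating x with the old v) yields a different algorithm with weaker stability properties, which is one reason the semi-implicit scheme appears in the paper's keywords.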
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
Supplementary Material: zip
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 667