Adaptive Accelerated Gradient Descent Methods for Convex Optimization

ICLR 2026 Conference Submission15319 Authors

19 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: Convex optimization, composite convex optimization, adaptive gradient descent, accelerated gradient descent, adaptive momentum
TL;DR: This work introduces A$^2$GD, an adaptive and accelerated gradient method that leverages ODE-inspired stability and Lyapunov-based parameter updates to achieve superior performance in convex and composite optimization.
Abstract: We propose A$^2$GD, an adaptive accelerated gradient method for convex and composite optimization. Drawing inspiration from stability analysis in ODE solvers, the method updates its smoothness and convexity estimates through Lyapunov-based formulas and invokes line search only when accumulated perturbations turn positive, an event that is empirically rare. This substantially reduces the number of gradient evaluations while preserving strong theoretical guarantees. By integrating adaptive step sizes with momentum acceleration, A$^2$GD outperforms existing first-order methods across diverse problem settings.
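For intuition, a minimal sketch of the generic pattern the abstract describes is given below: an adaptive step size driven by a running smoothness estimate combined with Nesterov-style momentum. The function name, the curvature-ratio update, and the constants are illustrative assumptions, not the Lyapunov-based parameter formulas or line-search trigger of A$^2$GD.

```python
import numpy as np

def adaptive_accelerated_gd(grad, x0, n_iters=500, L0=1.0):
    """Illustrative adaptive accelerated gradient loop (not the authors' A^2GD).

    The local smoothness estimate L is refreshed from successive gradients,
    and a Nesterov-style extrapolation supplies the momentum.
    """
    x = np.asarray(x0, dtype=float)
    y, y_prev = x.copy(), x.copy()
    g_prev = grad(y_prev)
    L, t = L0, 1.0
    for _ in range(n_iters):
        g = grad(y)
        # Refresh the local smoothness estimate from successive iterates/gradients.
        den = np.linalg.norm(y - y_prev)
        if den > 0:
            L = max(np.linalg.norm(g - g_prev) / den, 1e-12)
        x_new = y - g / L                                 # adaptive gradient step
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))  # momentum parameter
        y_new = x_new + (t - 1.0) / t_new * (x_new - x)   # Nesterov extrapolation
        y_prev, g_prev = y, g
        x, y, t = x_new, y_new, t_new
    return x
```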
Supplementary Material: zip
Primary Area: optimization
Submission Number: 15319