Adaptive Polyak Step-Size for Momentum Accelerated Stochastic Gradient Descent With General Convergence Guarantee

Jiawei Zhang, Cheng Jin, Yuantao Gu

Published: 01 Jan 2025, Last Modified: 06 Nov 2025 · IEEE Transactions on Signal Processing · CC BY-SA 4.0
Abstract: Momentum accelerated stochastic gradient descent (SGDM) has gained significant popularity in several signal processing and machine learning tasks. Despite its widespread success, the step size of SGDM remains a critical hyperparameter affecting its performance and often requires manual tuning. Recently, some works have introduced the Polyak step size to SGDM and provided corresponding convergence analyses. However, the convergence guarantees of existing Polyak step sizes for SGDM are limited to convex objectives and lack theoretical support for the more widely applicable non-convex setting. To bridge this gap, we design a novel Polyak adaptive step size for SGDM. The proposed algorithm, termed SGDM-APS, incorporates a moving-average form tailored to the momentum mechanism in SGDM. We establish convergence guarantees for SGDM-APS for both convex and non-convex objectives, providing a theoretical analysis of its effectiveness. To the best of our knowledge, SGDM-APS is the first Polyak step size for SGDM with a general convergence guarantee. Our analysis also extends to constant step size SGDM, enriching the theoretical understanding of the classic SGDM algorithm. Through extensive experiments on diverse benchmarks, we demonstrate that SGDM-APS achieves competitive convergence rates and generalization performance compared to several popular optimization algorithms.
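For intuition about the family of methods the abstract describes, below is a minimal sketch of a classic stochastic Polyak step size combined with heavy-ball momentum. This is not the paper's SGDM-APS update: the moving-average step-size construction and the constants used by SGDM-APS are defined in the paper itself and are not reproduced here. The function name, the scaling constant c, and the assumed optimal loss value loss_star are illustrative assumptions.

```python
import numpy as np

def sgdm_polyak_sketch(grad_fn, loss_fn, x0, n_steps=100,
                       beta=0.9, c=0.5, eps=1e-8, loss_star=0.0):
    """Heavy-ball SGD with a generic stochastic Polyak step size.

    Sketch only: this is NOT the SGDM-APS algorithm from the paper;
    beta, c, and loss_star are illustrative assumptions.
    """
    x = np.asarray(x0, dtype=float)
    v = np.zeros_like(x)  # momentum buffer
    for _ in range(n_steps):
        g = grad_fn(x)     # (stochastic) gradient at the current iterate
        loss = loss_fn(x)  # (stochastic) loss at the current iterate
        # Classic stochastic Polyak step size:
        # eta_t = (f_i(x_t) - f_i^*) / (c * ||grad f_i(x_t)||^2)
        eta = (loss - loss_star) / (c * np.dot(g, g) + eps)
        v = beta * v + eta * g  # heavy-ball momentum update
        x = x - v
    return x

# Toy usage on f(x) = 0.5 * ||x||^2, whose minimum value is 0.
x_min = sgdm_polyak_sketch(grad_fn=lambda x: x,
                           loss_fn=lambda x: 0.5 * np.dot(x, x),
                           x0=np.ones(5))
```

Note the contrast with a hand-tuned constant step size: here eta adapts to the current loss gap and gradient norm, which is the property the Polyak step size brings to SGDM.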