Improved Analysis for Sign-based Methods with Momentum Updates

ICLR 2026 Conference Submission 15147 Authors

19 Sept 2025 (modified: 08 Oct 2025), License: CC BY 4.0
Keywords: Stochastic optimization, non-convex optimization, sign-based methods, convergence guarantees
Abstract: This paper presents enhanced analysis for sign-based optimization algorithms with momentum updates. Traditional sign-based methods obtain a convergence rate of $\mathcal{O}(T^{-1/4})$ under the separable smoothness assumption, but they typically require large batch sizes or assume unimodal symmetric stochastic noise. To address these limitations, we demonstrate that signSGD with momentum can achieve the same convergence rate using constant batch sizes without additional assumptions. We also establish a convergence rate under the $l_2$-smoothness condition, improving upon the result of the prior momentum-based signSGD variant by a factor of $\mathcal{O}(d^{1/2})$, where $d$ is the problem dimension. Furthermore, we explore sign-based methods with majority vote in distributed settings and show that the proposed momentum-based method yields convergence rates of $\mathcal{O}\left( d^{1/2}T^{-1/2} + dn^{-1/2} \right)$ and $\mathcal{O}\left( \max \\{ d^{1/4}T^{-1/4}, d^{1/10}T^{-1/5} \\} \right)$, which outperform the previous results of $\mathcal{O}\left( dT^{-1/4} + dn^{-1/2} \right)$ and $\mathcal{O}\left( d^{3/8}T^{-1/8} \right)$, respectively. Numerical experiments also validate the effectiveness of the proposed methods.
Primary Area: optimization
Submission Number: 15147