Keywords: Stochastic gradient algorithm, accelerated adaptive momentum estimation, convergence analysis, low-light image enhancement
Abstract: We develop an enhanced version of the Adam algorithm for solving "nonconvex + weakly-convex" composite optimization problems, termed Stochastic Two-track Nesterov-accelerated Adaptive Momentum Estimation (STNAdam). A key difference from existing accelerated variants of Adam is that STNAdam adopts a novel two-track iteration framework, maintaining two intertwined iteration trajectories: an extrapolation track governed by Nesterov momentum and a regular update track governed by Adam-style adaptive conditioning. This design enlarges the effective update neighborhood while continuously exploring better iteration directions. The stochastic gradient in STNAdam may be supplied by an arbitrary variance-reduced gradient estimator, such as SVRG, SAGA, or SARAH, and the internal hyper-parameters induced by the estimator can be dynamically scheduled within iterate-dependent finite intervals. Under the Kurdyka-Łojasiewicz property, we show that the sequence generated by STNAdam almost surely converges to a stationary point of the original problem at an explicit rate. Empirical results on low-light image enhancement demonstrate the superior performance of the proposed method.
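For intuition, below is a minimal Python sketch of one plausible two-track update of the kind the abstract describes: the gradient is evaluated on the Nesterov extrapolation track, conditioned Adam-style on the regular track, and the extrapolated point is then refreshed. All names, hyper-parameters (e.g., the extrapolation weight `mu`), and the exact update order are illustrative assumptions, not the paper's actual algorithm; `grad_fn` stands in for any stochastic oracle, including a variance-reduced estimator such as SVRG, SAGA, or SARAH.

```python
import numpy as np

def stnadam_step(x, y, m, v, grad_fn, t, lr=1e-3,
                 beta1=0.9, beta2=0.999, mu=0.9, eps=1e-8):
    """One hypothetical two-track step (sketch, not the paper's method).

    x : iterate on the regular (Adam-style) update track
    y : iterate on the extrapolation (Nesterov) track
    m, v : first/second moment estimates; t : step count (>= 1)
    grad_fn : stochastic (possibly variance-reduced) gradient oracle
    """
    g = grad_fn(y)                        # gradient at the extrapolated point
    m = beta1 * m + (1 - beta1) * g       # Adam first-moment estimate
    v = beta2 * v + (1 - beta2) * g**2    # Adam second-moment estimate
    m_hat = m / (1 - beta1**t)            # bias corrections
    v_hat = v / (1 - beta2**t)
    x_new = y - lr * m_hat / (np.sqrt(v_hat) + eps)  # regular update track
    y_new = x_new + mu * (x_new - x)                 # Nesterov extrapolation track
    return x_new, y_new, m, v
```

In this sketch the two tracks are intertwined exactly as the abstract suggests: the Adam-conditioned step starts from the extrapolated point `y`, and the next extrapolation is built from the two most recent regular iterates.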
Primary Area: optimization
Submission Number: 9356