Generalized Gradient Flows With Provable Fixed-Time Convergence and Fast Evasion of Non-Degenerate Saddle Points
Abstract: Gradient-based first-order convex optimization algorithms are widely used across a variety of domains, including machine learning tasks. Motivated by recent advances in the fixed-time stability theory of continuous-time dynamical systems, we introduce a generalized framework for designing accelerated optimization algorithms with the strongest convergence guarantees, guarantees that further extend to a subclass of nonconvex functions. In particular, we introduce the GenFlow algorithm and its momentum variant, which provably converge in fixed time to the optimal solution of objective functions satisfying the Polyak–Łojasiewicz inequality. Moreover, for functions that admit non-degenerate saddle points, we show that the time the proposed GenFlow algorithm requires to evade these saddle points is uniformly bounded over all initial conditions. Finally, for strongly convex–strongly concave minimax problems whose optimal solution is a saddle point, a similar scheme is shown to reach the optimal solution, again in fixed time. The superior convergence properties of our algorithm are validated experimentally on a variety of benchmark datasets.
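For context, the two notions invoked in the abstract can be stated in their standard textbook form; the following is a minimal sketch using the usual definitions and generic notation (f, mu, T_max), not the specific dynamics or constants of GenFlow.

% Standard definitions (illustrative sketch; not the paper's specific flow or constants).
\begin{align}
  % Polyak--Lojasiewicz (PL) inequality with modulus \mu > 0:
  \tfrac{1}{2}\,\|\nabla f(x)\|^{2} \;\ge\; \mu\,\bigl(f(x) - f^{\star}\bigr)
    \qquad \text{for all } x, \\
  % Fixed-time convergence of a flow \dot{x} = g(x) to the minimizer x^{\star}:
  \exists\, T_{\max} < \infty \ \text{such that} \ x(t) = x^{\star}
    \quad \text{for all } t \ge T_{\max} \ \text{and all initial conditions } x(0).
\end{align}

The second condition strengthens finite-time convergence by requiring the settling-time bound T_max to hold uniformly over all initial conditions, which is the sense in which the abstract's convergence and saddle-point evasion times are "fixed."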