Faster first-order primal-dual methods for linear programming using restarts and sharpness

David L. Applegate, Oliver Hinder, Haihao Lu, Miles Lubin

Published: 2023, Last Modified: 09 May 2026 · Math. Program. 2023 · CC BY-SA 4.0
Abstract: First-order primal-dual methods are appealing for their low memory overhead, fast iterations, and effective parallelization. However, they are often slow at finding high-accuracy solutions, which creates a barrier to their use in traditional linear programming (LP) applications. This paper exploits the sharpness of primal-dual formulations of LP instances to achieve linear convergence using restarts in a general setting that applies to the alternating direction method of multipliers (ADMM), the primal-dual hybrid gradient method (PDHG), and the extragradient method (EGM). In the special case of PDHG, without restarts we show an iteration count lower bound of \(\Omega (\kappa ^2 \log (1/\epsilon ))\), while with restarts we show an iteration count upper bound of \(O(\kappa \log (1/\epsilon ))\), where \(\kappa \) is a condition number and \(\epsilon \) is the desired accuracy. Moreover, the upper bound is optimal for a wide class of primal-dual methods, and applies to the strictly more general class of sharp primal-dual problems. We develop an adaptive restart scheme and verify that restarts significantly improve the ability of PDHG, EGM, and ADMM to find high-accuracy solutions to LP problems.
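To make the restart idea concrete, here is a minimal sketch of PDHG with an adaptive restart applied to a tiny LP in standard form, \(\min c^\top x\) s.t. \(Ax = b\), \(x \ge 0\). The specific problem data, the KKT-residual progress metric, and the halving threshold are illustrative assumptions, not the paper's exact scheme: the iterates are averaged within each restart epoch, and the method restarts from the running average once that average's residual has halved relative to the epoch's starting point.

```python
import numpy as np

# Illustrative LP (not from the paper):
#   min  x1 + 2*x2   s.t.  x1 + x2 = 1,  x >= 0
# Optimal primal solution: x* = (1, 0); optimal dual: y* = 1.
c = np.array([1.0, 2.0])
A = np.array([[1.0, 1.0]])
b = np.array([1.0])

def kkt_residual(x, y):
    """Norm of stacked primal infeasibility, dual infeasibility, and gap."""
    primal = A @ x - b
    dual = np.minimum(c - A.T @ y, 0.0)   # negative entries violate A^T y <= c
    gap = c @ x - b @ y
    return np.sqrt(np.sum(primal**2) + np.sum(dual**2) + gap**2)

def restarted_pdhg(x, y, tau, sigma, tol=1e-8, max_outer=200, max_inner=1000):
    for _ in range(max_outer):
        r0 = kkt_residual(x, y)           # residual at the restart point
        x_sum, y_sum, n = np.zeros_like(x), np.zeros_like(y), 0
        while True:
            # Standard PDHG step: primal proximal step onto x >= 0,
            # then dual ascent step using the extrapolated primal iterate.
            x_new = np.maximum(x - tau * (c - A.T @ y), 0.0)
            y = y + sigma * (b - A @ (2 * x_new - x))
            x = x_new
            x_sum += x; y_sum += y; n += 1
            x_avg, y_avg = x_sum / n, y_sum / n
            # Adaptive restart: restart from the running average once its
            # residual has halved (or after a safety cap on inner steps).
            if kkt_residual(x_avg, y_avg) <= 0.5 * r0 or n >= max_inner:
                x, y = x_avg, y_avg
                break
        if kkt_residual(x, y) <= tol:
            break
    return x, y

# Step sizes satisfying tau * sigma * ||A||^2 < 1 for convergence.
step = 0.9 / np.linalg.norm(A, 2)
x, y = restarted_pdhg(np.zeros(2), np.zeros(1), step, step)
```

On sharp problems such as LPs, restarting from the average is what upgrades PDHG's sublinear averaged-iterate rate to the linear \(O(\kappa \log(1/\epsilon))\) rate claimed in the abstract; without the restart, the last iterate alone converges but needs \(\Omega(\kappa^2 \log(1/\epsilon))\) iterations.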