GT-SAGA: A fast incremental gradient method for decentralized finite-sum minimization

CDC 2020
Abstract: In this paper, we study decentralized solutions for finite-sum minimization problems when the underlying training data is distributed over a network of nodes. In particular, we describe the GT-SAGA algorithm, which combines variance reduction and gradient tracking to achieve both robust performance and fast convergence. Variance reduction is implemented locally to asymptotically estimate the local batch gradient at each node, while gradient tracking fuses the local gradient estimates across the nodes. Combining variance reduction and gradient tracking thus enables linear convergence to the optimal solution of strongly convex problems while keeping the per-iteration computational complexity at each node low. We cast the convergence and behavior of GT-SAGA and related methods in the context of practical tradeoffs and further compare their performance on a logistic regression problem with strongly convex regularization.
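The abstract does not include pseudocode; the following is a minimal Python sketch of the combination it describes (a SAGA-style local gradient estimator plus a gradient-tracking consensus step), applied to decentralized least squares rather than the paper's logistic regression experiment. The function name gt_saga, the mixing matrix W, the step size alpha, and the initialization are illustrative assumptions, not the authors' implementation.

import numpy as np

def gt_saga(local_data, W, alpha, num_iters, seed=0):
    # Sketch of a GT-SAGA-style iteration for decentralized least squares:
    # node i holds samples (A_i, b_i) and the network jointly minimizes the
    # average of f_i(x) = (1/m_i) * sum_s 0.5 * (a_{i,s}^T x - b_{i,s})^2.
    rng = np.random.default_rng(seed)
    n = len(local_data)
    d = local_data[0][0].shape[1]

    x = np.zeros((n, d))  # one local iterate per node
    # SAGA gradient tables: stored component gradients, evaluated at x^0
    tables = [A * (A @ xi - b)[:, None] for (A, b), xi in zip(local_data, x)]
    table_avg = np.array([t.mean(axis=0) for t in tables])

    g = table_avg.copy()  # local SAGA estimators at x^0
    y = g.copy()          # gradient trackers, initialized to the estimators

    for _ in range(num_iters):
        # consensus + descent: mix neighbors' iterates, step along the tracker
        x = W @ x - alpha * y
        g_new = np.empty_like(g)
        for i, (A, b) in enumerate(local_data):
            m = A.shape[0]
            s = rng.integers(m)  # draw one local component uniformly
            grad_s = A[s] * (A[s] @ x[i] - b[s])
            # SAGA estimator: fresh gradient - stored gradient + table average
            g_new[i] = grad_s - tables[i][s] + table_avg[i]
            table_avg[i] += (grad_s - tables[i][s]) / m
            tables[i][s] = grad_s
        # gradient tracking: fuse neighbors' trackers, add estimator increment
        y = W @ y + g_new - g
        g = g_new
    return x

# Hypothetical usage: 4 nodes on a ring with symmetric, doubly stochastic weights
n, d, m = 4, 3, 10
rng = np.random.default_rng(1)
data = [(rng.standard_normal((m, d)), rng.standard_normal(m)) for _ in range(n)]
W = 0.5 * np.eye(n) + 0.25 * (np.roll(np.eye(n), 1, axis=0) + np.roll(np.eye(n), -1, axis=0))
x_final = gt_saga(data, W, alpha=0.01, num_iters=2000)

Note how each node touches only one component gradient per iteration (the low per-iteration cost the abstract highlights), while the tracker y preserves the running network-wide average of the local estimators.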