ProxSkip for Stochastic Variational Inequalities: A Federated Learning Algorithm for Provable Communication Acceleration

Published: 23 Nov 2022, Last Modified: 05 May 2023 · OPT 2022 Poster
Abstract: Recently, Mishchenko et al. (2022) proposed and analyzed ProxSkip, a provably efficient method for minimizing the sum of a smooth function $f$ and an expensive nonsmooth proximable function $R$, i.e. $\min_{x \in \mathbb{R}^d} f(x) + R(x)$. The main advantage of ProxSkip is that, in the federated learning (FL) setting, it provably achieves an effective acceleration of communication complexity. This work extends the approach to the more general regularized variational inequality problem (VIP). In particular, we propose the ProxSkip-VIP algorithm, which generalizes the original ProxSkip framework of Mishchenko et al. (2022) to VIPs, and we provide convergence guarantees for a class of structured non-monotone problems. In the federated learning setting, we explain how our approach achieves acceleration in communication complexity over existing state-of-the-art FL algorithms.
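To make the recursion concrete, here is a minimal sketch of a ProxSkip-style iteration adapted to a VIP, based on the original ProxSkip recursion of Mishchenko et al. (2022) with the gradient $\nabla f$ replaced by a general operator $F$, as the abstract describes. The function names and interface (`F`, `prox_R`) are illustrative assumptions, not the paper's code; the key idea is that the expensive prox (which models a communication round in FL) is applied only with probability $p$, while a control variate $h$ keeps the skipped steps consistent.

```python
import numpy as np

def proxskip_vip(F, prox_R, x0, gamma, p, num_iters, rng=None):
    """Sketch of a ProxSkip-style method for a regularized VIP.

    F       : operator x -> F(x) (replaces the gradient of f in ProxSkip)
    prox_R  : (x, step) -> prox_{step * R}(x), proximal operator of R
    x0      : initial point
    gamma   : stepsize
    p       : probability of performing the (expensive) prox step
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    h = np.zeros_like(x)                     # control variate, as in ProxSkip
    for _ in range(num_iters):
        x_hat = x - gamma * (F(x) - h)       # cheap, prox-free step
        if rng.random() < p:                 # with probability p, apply prox
            x = prox_R(x_hat - (gamma / p) * h, gamma / p)
        else:                                # otherwise skip the prox entirely
            x = x_hat
        h = h + (p / gamma) * (x - x_hat)    # update control variate
    return x
```

For a quick check on a toy strongly monotone VIP, one could take an affine operator `F = lambda x: A @ x + b` with positive-definite `A` and the soft-thresholding prox of an $\ell_1$ regularizer; in the FL reading, each prox call corresponds to a communication round, so small $p$ means fewer rounds.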