Keywords: variational inequalities, distributed optimization, similarity, statistical preconditioning, proximal method, Bregman setup
Abstract: We propose a novel stochastic distributed method for monotone and strongly monotone variational inequalities with a Lipschitz operator and proper convex regularizers, which arise in applications ranging from game theory to adversarial training. By exploiting \textit{similarity}, our algorithm overcomes the communication bottleneck that is a major issue in distributed optimization, and it enjoys optimal communication complexity. All existing distributed algorithms that achieve the lower bounds under the similarity condition essentially rely on the Euclidean setup. In contrast, our method is built upon Bregman proximal maps and is compatible with arbitrary problem geometry, thereby filling an existing gap in this area of research. Our theoretical results are confirmed by numerical experiments on a stochastic matrix game.
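As a minimal illustration of the kind of Bregman proximal map the abstract refers to (not the paper's distributed algorithm), the following sketch runs a classical mirror-prox (extragradient) scheme with the negative-entropy setup on a small matrix game over probability simplices; the matrix, step size, and iteration count are illustrative assumptions.

```python
import numpy as np

def entropy_prox(x, g, step):
    """Bregman proximal step with the negative-entropy mirror map:
    argmin over the simplex of <step * g, z> + KL(z || x), in closed form."""
    z = x * np.exp(-step * g)
    return z / z.sum()

# Matrix game min_x max_y x^T A y with operator F(x, y) = (A y, -A^T x).
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))          # illustrative payoff matrix
x = np.ones(3) / 3                        # uniform start for the min player
y = np.ones(3) / 3                        # uniform start for the max player
step = 0.1

for _ in range(200):
    # Extrapolation (lead) step at the current point ...
    x_half = entropy_prox(x, A @ y, step)
    y_half = entropy_prox(y, -A.T @ x, step)
    # ... followed by the update step using the operator at the lead point.
    x = entropy_prox(x, A @ y_half, step)
    y = entropy_prox(y, -A.T @ x_half, step)
```

Because the entropy prox has this closed multiplicative form, the iterates stay on the simplex without any Euclidean projection, which is what "compatibility with the problem geometry" buys in this setup.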
Submission Number: 26