Accelerated Methods with Complexity Separation Under Data Similarity for Federated Learning Problems
Keywords: Convex optimization, Data similarity, Composite optimization, Stochastic optimization
TL;DR: The federated learning problem is studied as a composite problem with multiple data similarity constants
Abstract: Heterogeneity of data distributions is an issue encountered in many modern federated learning tasks. We formalize it as an optimization problem with a computationally heavy composite term under data similarity. Using different sets of assumptions, we present several approaches to constructing communication-efficient methods. An optimal algorithm is proposed for the case of a convex composite. The developed theory is validated through a series of experiments on a range of problems, including CIFAR-10 classification with ResNet-18.
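A minimal sketch of the standard composite formulation under data similarity referred to in the abstract; the symbols $f_i$, $r$, and the similarity constant $\delta$ are illustrative assumptions and not necessarily the paper's exact notation:
\[
\min_{x \in \mathbb{R}^d} \; F(x) := f(x) + r(x), \qquad f(x) = \frac{1}{n}\sum_{i=1}^{n} f_i(x),
\]
\[
\|\nabla^2 f_i(x) - \nabla^2 f(x)\| \le \delta \quad \text{for all } x \in \mathbb{R}^d,
\]
where $f_i$ is the local loss on device $i$, $r$ is the computationally heavy composite term, and $\delta$ quantifies how similar the local data (and hence local Hessians) are to the average objective $f$; smaller $\delta$ typically allows fewer communication rounds.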
Primary Area: optimization
Submission Number: 8861