Provable Reduction in Communication Rounds for Non-Smooth Convex Federated Learning

Published: 01 Jan 2025, Last Modified: 15 May 2025. CoRR 2025. License: CC BY-SA 4.0.
Abstract: Multiple local steps are key to communication-efficient federated learning. However, for general non-smooth convex problems, theoretical guarantees for such algorithms have been lacking without assumptions that bound data heterogeneity. Leveraging projection-efficient optimization methods, we propose FedMLS, a federated learning algorithm with provable improvements from multiple local steps. FedMLS attains an $\epsilon$-suboptimal solution in $\mathcal{O}(1/\epsilon)$ communication rounds, requiring a total of $\mathcal{O}(1/\epsilon^2)$ stochastic subgradient oracle calls.
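To make the setup concrete, below is a minimal, generic sketch of a federated algorithm that performs multiple local stochastic subgradient steps between communication rounds on a non-smooth convex objective (hinge loss). This is only an illustration of the problem setting; it is not the paper's FedMLS algorithm, and all names, data, and hyperparameters are illustrative assumptions.

```python
# Generic sketch (NOT the paper's FedMLS): federated subgradient descent with
# multiple local steps per communication round on a non-smooth convex problem.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data split across clients (illustrative only).
num_clients, n_per_client, dim = 4, 50, 10
w_true = rng.normal(size=dim)
clients = []
for _ in range(num_clients):
    X = rng.normal(size=(n_per_client, dim))
    y = np.sign(X @ w_true + 0.1 * rng.normal(size=n_per_client))
    clients.append((X, y))

def hinge_subgradient(w, X, y, batch=8):
    """Stochastic subgradient of the (non-smooth) hinge loss on one client."""
    idx = rng.choice(len(y), size=batch, replace=False)
    Xb, yb = X[idx], y[idx]
    margins = yb * (Xb @ w)
    active = margins < 1.0  # samples violating the margin contribute -y * x
    return -(yb[active][:, None] * Xb[active]).sum(axis=0) / batch

def federated_subgradient(rounds=100, local_steps=10, lr=0.05):
    w_global = np.zeros(dim)
    for _ in range(rounds):                       # communication rounds
        local_models = []
        for X, y in clients:
            w = w_global.copy()
            for _ in range(local_steps):          # multiple local steps
                w -= lr * hinge_subgradient(w, X, y)
            local_models.append(w)
        w_global = np.mean(local_models, axis=0)  # server averages the models
    return w_global

w = federated_subgradient()
```

In this template, the total number of stochastic subgradient oracle calls is rounds × local_steps per client, so trading more local steps for fewer communication rounds is exactly the regime the paper's guarantees address.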