Inexact Alternating Direction Method of Multipliers with Efficient Local Termination Criterion for Cross-silo Federated Learning

TMLR Paper2858 Authors

12 Jun 2024 (modified: 02 Jul 2024) · Under review for TMLR · CC BY-SA 4.0
Abstract: Federated learning has attracted increasing attention in the machine learning community over the past five years. In this paper, we propose a new cross-silo federated learning algorithm with fast convergence guarantees for training machine learning models with nonsmooth regularizers. To solve this type of problem, we design an inexact federated alternating direction method of multipliers (ADMM). This method requires each agent to solve only a strongly convex local problem. We introduce a new local termination criterion that can be quickly satisfied when using efficient solvers such as stochastic variance reduced gradient (SVRG). We prove that our method converges faster than existing methods. Moreover, we show that our proposed method has sequential convergence guarantees under the Kurdyka-Łojasiewicz (KL) assumption. We conduct experiments on both synthetic and real datasets to demonstrate the superiority of our new method over existing algorithms.
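The abstract does not spell out the algorithm, so the following is only a rough illustrative sketch of the general idea: a consensus-form inexact ADMM in which each agent stops its local subproblem solve once a gradient-norm tolerance is met, and a nonsmooth L1 regularizer is handled by a proximal step at the server. The quadratic local losses, the plain gradient inner solver (in place of SVRG), the tolerance schedule, and all names are assumptions for illustration, not the paper's actual method.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (handles the nonsmooth regularizer).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def inexact_consensus_admm(local_As, local_bs, lam=0.1, rho=1.0,
                           outer_iters=50, inner_tol0=1.0, inner_decay=0.5):
    """Illustrative sketch: solve
        min_z  sum_i 0.5 * ||A_i x_i - b_i||^2 + lam * ||z||_1
        s.t.   x_i = z  for every agent i,
    where each agent's strongly convex subproblem is solved INEXACTLY:
    the inner loop terminates as soon as the local gradient norm drops
    below a tolerance that shrinks across outer rounds."""
    n = local_As[0].shape[1]
    m = len(local_As)
    xs = [np.zeros(n) for _ in range(m)]   # local iterates
    us = [np.zeros(n) for _ in range(m)]   # scaled dual variables
    z = np.zeros(n)                        # global (server) variable
    tol = inner_tol0
    for _ in range(outer_iters):
        for i in range(m):
            A, b = local_As[i], local_bs[i]
            # Smoothness constant of the local augmented Lagrangian.
            L = np.linalg.norm(A, 2) ** 2 + rho
            x = xs[i]
            for _ in range(200):
                grad = A.T @ (A @ x - b) + rho * (x - z + us[i])
                if np.linalg.norm(grad) <= tol:
                    break   # local termination criterion satisfied
                x = x - grad / L
            xs[i] = x
        # Server step: average local messages, then prox of the L1 term.
        z = soft_threshold(np.mean([xs[i] + us[i] for i in range(m)], axis=0),
                           lam / (rho * m))
        for i in range(m):
            us[i] = us[i] + xs[i] - z
        tol *= inner_decay  # tighten the inexactness over outer rounds
    return z
```

On a toy two-agent lasso instance the sketch recovers the expected soft-thresholded consensus solution; the shrinking inner tolerance is what keeps early rounds cheap while still letting the outer ADMM iterates converge.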
Submission Length: Regular submission (no more than 12 pages of main content)
Assigned Action Editor: ~Konstantin_Mishchenko1
Submission Number: 2858