Abstract: Federated learning (FL) has attracted significant attention in recent years for building privacy-preserving collaborative learning systems. However, FL algorithms for constrained machine learning problems remain limited, particularly when the projection step is costly. To this end, we propose a Federated Frank-Wolfe Algorithm (FedFW). FedFW features data privacy, low per-iteration cost, and communication of sparse signals. In the deterministic setting, FedFW achieves an \(\varepsilon\)-suboptimal solution within \(\mathcal{O}(\varepsilon^{-2})\) iterations for smooth and convex objectives, and \(\mathcal{O}(\varepsilon^{-3})\) iterations for smooth but non-convex objectives. Furthermore, we present a stochastic variant of FedFW and show that it finds a solution within \(\mathcal{O}(\varepsilon^{-3})\) iterations in the convex setting. We demonstrate the empirical performance of FedFW on several machine learning tasks.
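As background for why projection-free methods have low per-iteration cost, the sketch below shows a single classical (centralized) Frank-Wolfe step over an \(\ell_1\)-ball constraint, where the linear minimization oracle reduces to selecting one coordinate instead of computing a projection. This is a minimal illustrative sketch only: the function names, the \(2/(t+2)\) step-size rule, and the \(\ell_1\)-ball constraint are assumptions for illustration, not the paper's FedFW algorithm.

```python
import numpy as np

def lmo_l1_ball(grad, radius=1.0):
    """Linear minimization oracle over the l1-ball:
    argmin_{||s||_1 <= radius} <grad, s>.
    Needs only the largest-magnitude gradient coordinate,
    which is typically much cheaper than a projection."""
    s = np.zeros_like(grad)
    i = np.argmax(np.abs(grad))
    s[i] = -radius * np.sign(grad[i])
    return s

def frank_wolfe_step(x, grad, t, radius=1.0):
    """One classical Frank-Wolfe update with the standard 2/(t+2) step size."""
    s = lmo_l1_ball(grad, radius)
    gamma = 2.0 / (t + 2.0)
    return x + gamma * (s - x)

# Toy usage (hypothetical problem): least squares over an l1-ball of radius 5.
rng = np.random.default_rng(0)
A, b = rng.normal(size=(50, 20)), rng.normal(size=50)
x = np.zeros(20)  # feasible start; iterates stay in the ball via convex combinations
for t in range(100):
    grad = 2 * A.T @ (A @ x - b)
    x = frank_wolfe_step(x, grad, t, radius=5.0)
print("objective:", np.sum((A @ x - b) ** 2))
```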