Keywords: federated learning, frank wolfe, conditional gradient method, projection-free, distributed optimization
TL;DR: We introduce a new Frank-Wolfe algorithm designed for the federated learning paradigm in machine learning
Abstract: Federated learning (FL) has gained much attention in recent years for building privacy-preserving collaborative learning systems. However, FL algorithms for constrained machine learning problems are still very limited, particularly when the projection step is costly. To this end, we propose a Federated Frank-Wolfe algorithm (FedFW). FedFW provably finds an $\varepsilon$-suboptimal solution of the constrained empirical risk minimization problem within $\mathcal{O}(\varepsilon^{-2})$ iterations if the objective function is convex; the rate becomes $\mathcal{O}(\varepsilon^{-3})$ if the objective is non-convex. The method enjoys data privacy, low per-iteration cost, and communication of sparse signals. We demonstrate the empirical performance of FedFW on several machine learning tasks.
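To illustrate the projection-free idea the abstract refers to, below is a minimal sketch of a classic (centralized) Frank-Wolfe iteration over an $\ell_1$-ball constraint. This is the standard building block FedFW extends to the federated setting, not the paper's actual FedFW algorithm; the quadratic objective, radius `tau`, and the step size $\gamma_t = 2/(t+2)$ are illustrative assumptions.

```python
import numpy as np

def lmo_l1(grad, tau):
    """Linear minimization oracle for the l1 ball of radius tau:
    argmin over ||s||_1 <= tau of <grad, s> is a signed, scaled
    coordinate vector -- much cheaper than a projection in general."""
    i = np.argmax(np.abs(grad))
    s = np.zeros_like(grad)
    s[i] = -tau * np.sign(grad[i])
    return s

def frank_wolfe(A, b, tau, iters=200):
    """Minimize 0.5 * ||Ax - b||^2 over the l1 ball via Frank-Wolfe."""
    x = np.zeros(A.shape[1])  # feasible starting point
    for t in range(iters):
        grad = A.T @ (A @ x - b)         # gradient of the quadratic loss
        s = lmo_l1(grad, tau)            # linear step, no projection needed
        gamma = 2.0 / (t + 2.0)          # standard FW step-size rule
        x = (1 - gamma) * x + gamma * s  # convex combination stays feasible
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((30, 10))
b = rng.standard_normal(30)
x = frank_wolfe(A, b, tau=2.0)
print(np.sum(np.abs(x)) <= 2.0 + 1e-9)  # iterate remains in the l1 ball
```

Because each iterate is a convex combination of feasible points, the method never leaves the constraint set, which is what makes Frank-Wolfe attractive when projections are costly.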
Is Student: Yes
Community Implementations: [1 code implementation](https://www.catalyzex.com/paper/federated-frank-wolfe-algorithm/code)