Decentralized Federated Learning for Over-Parameterized Models

Published: 01 Jan 2022 (CDC 2022), Last Modified: 21 Feb 2024
Abstract: Modern machine learning, especially deep learning, features models that are often highly expressive and over-parameterized. They can interpolate the data by driving the empirical loss close to zero. We analyze the convergence rate of decentralized stochastic gradient descent (SGD), which is at the core of decentralized federated learning (DFL), for these over-parameterized models. Our analysis covers the setting of decentralized SGD with time-varying networks, local updates, and heterogeneous data. We establish strong convergence guarantees, with or without the assumption of convex objectives, that either improve upon the existing literature or are the first for this regime.
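
The abstract refers to decentralized SGD with time-varying networks, local updates, and heterogeneous data. As an illustration only, here is a minimal sketch of that template: each node runs a few SGD steps on its own local data, then gossip-averages its parameters with its current neighbors via a doubly stochastic mixing matrix whose topology changes from round to round. The toy least-squares problem, the alternating-ring mixing scheme, and all parameter choices below are assumptions made for illustration, not the paper's actual algorithm or experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup: n nodes, each holding heterogeneous local data for
# a least-squares problem. The model dimension d exceeds the total sample
# count, so an interpolating solution (zero empirical loss) exists.
n_nodes, d, m = 4, 50, 10           # nodes, parameters, samples per node
X = [rng.normal(size=(m, d)) for _ in range(n_nodes)]
w_star = rng.normal(size=d)
y = [Xi @ w_star for Xi in X]       # noiseless targets => interpolation holds

w = [np.zeros(d) for _ in range(n_nodes)]   # one parameter vector per node
lr, local_steps, rounds = 0.05, 5, 200

def ring_mixing_matrix(n, t):
    """Doubly stochastic mixing matrix over a time-varying ring:
    the ring's orientation alternates with the round index t."""
    W = np.eye(n) * 0.5
    shift = 1 if t % 2 == 0 else -1
    for i in range(n):
        W[i, (i + shift) % n] = 0.5
    return W

for t in range(rounds):
    # Local updates: each node takes a few SGD steps on its own data.
    for i in range(n_nodes):
        for _ in range(local_steps):
            j = rng.integers(m)                        # sample one data point
            grad = (X[i][j] @ w[i] - y[i][j]) * X[i][j]
            w[i] = w[i] - lr * grad
    # Communication: gossip-average parameters with the current neighbors.
    W = ring_mixing_matrix(n_nodes, t)
    w = list(W @ np.stack(w))

loss = sum(np.mean((X[i] @ w[i] - y[i]) ** 2) for i in range(n_nodes)) / n_nodes
print(f"average empirical loss after {rounds} rounds: {loss:.2e}")
```

In this sketch the empirical loss is driven toward zero because the noiseless targets admit an interpolating solution, which is the over-parameterized regime the abstract describes; the alternating ring stands in for a general time-varying communication graph.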