Data-heterogeneity-aware Mixing for Decentralized Learning

Published: 23 Nov 2022, Last Modified: 05 May 2023
OPT 2022 Poster
Keywords: decentralized learning, optimization, gossip
TL;DR: We provide a theoretical analysis of the interaction between the mixing matrix and data heterogeneity in decentralized SGD and propose an algorithm that efficiently adapts the mixing matrix to the heterogeneity of the gradients.
Abstract: Decentralized learning provides an effective framework for training machine learning models with data distributed over arbitrary communication graphs. However, most existing approaches to decentralized learning disregard the interaction between data heterogeneity and graph topology. In this paper, we characterize how convergence depends on the relationship between the mixing weights of the graph and the data heterogeneity across nodes. We propose a metric that quantifies the ability of a graph to mix the current gradients. We further prove that the metric controls the convergence rate, particularly in settings where the heterogeneity across nodes dominates the stochasticity of successive updates at a given node. Motivated by our analysis, we propose an approach that periodically and efficiently optimizes the metric using standard convex constrained optimization and sketching techniques.
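The abstract does not spell out the metric or the optimization procedure, so the following is a minimal sketch under stated assumptions, not the authors' algorithm: it takes the metric to be the residual heterogeneity after one gossip step, ||W G||_F^2 with G the matrix of mean-centered local gradients, and solves for symmetric doubly stochastic mixing weights W supported on the graph via a convex QP (using cvxpy). The function name heterogeneity_aware_mixing is illustrative, and the paper's sketching-based efficiency techniques are omitted.

```python
# Hypothetical sketch: choose mixing weights that best damp the current
# gradient heterogeneity. NOT the paper's exact metric or method.
import numpy as np
import cvxpy as cp

def heterogeneity_aware_mixing(adjacency: np.ndarray, grads: np.ndarray) -> np.ndarray:
    """adjacency: (n, n) symmetric 0/1 graph; grads: (n, d) local gradients.

    Returns a symmetric doubly stochastic mixing matrix supported on the
    graph (self-loops allowed) that minimizes ||W G||_F^2, where G holds
    the mean-centered gradients, i.e. the heterogeneity left after one
    mixing step. Assumed objective, stated in the lead-in above.
    """
    n = adjacency.shape[0]
    # Center the gradients: the consensus component is unaffected by any
    # doubly stochastic W, so only the deviation from the mean matters.
    G = grads - grads.mean(axis=0, keepdims=True)

    W = cp.Variable((n, n), symmetric=True)
    constraints = [
        W @ np.ones(n) == np.ones(n),  # rows sum to 1; symmetry handles columns
        W >= 0,                        # nonnegative weights
    ]
    # Respect the communication graph: zero weight on non-edges.
    allowed = adjacency + np.eye(n)
    constraints += [W[i, j] == 0
                    for i in range(n) for j in range(n) if allowed[i, j] == 0]

    problem = cp.Problem(cp.Minimize(cp.sum_squares(W @ G)), constraints)
    problem.solve()
    return W.value

# Example usage: a 4-node ring with dummy gradients.
if __name__ == "__main__":
    A = np.array([[0, 1, 0, 1],
                  [1, 0, 1, 0],
                  [0, 1, 0, 1],
                  [1, 0, 1, 0]], dtype=float)
    g = np.random.randn(4, 10)
    W = heterogeneity_aware_mixing(A, g)
    print(np.round(W, 3))
```

Constraining W to be doubly stochastic keeps the average model (and gradient) invariant under mixing while the objective damps node-to-node disagreement; re-solving this problem periodically as gradients evolve matches the abstract's "periodically and efficiently optimizes the metric", though the efficient sketched variant is beyond this sketch.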