Averaging Rate Scheduler for Decentralized Learning on Heterogeneous Data

Published: 19 Mar 2024, Last Modified: 30 Mar 2024 · Tiny Papers @ ICLR 2024 · CC BY 4.0
Keywords: Decentralized Learning, Federated Learning, Heterogeneous Data, Distributed Training
TL;DR: We propose averaging rate scheduling as a simple yet effective way to reduce the impact of heterogeneity in decentralized learning.
Abstract: Presently, state-of-the-art decentralized learning algorithms typically require the data distribution to be Independent and Identically Distributed (IID). However, in practical scenarios, the data distribution across the agents can have significant heterogeneity. In this work, we propose averaging rate scheduling as a simple yet effective way to reduce the impact of heterogeneity in decentralized learning. Our experiments illustrate the superiority of the proposed method ($\sim 3\%$ improvement in test accuracy) compared to the conventional approach of employing a constant averaging rate.
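To make the idea concrete, below is a minimal sketch of gossip averaging with a scheduled averaging rate instead of a constant one. The ring topology, the linear warm-up schedule, and all parameter values are illustrative assumptions; the paper's actual schedule and experimental setup are not reproduced here.

```python
import numpy as np

def gossip_step(params, W, gamma):
    # One gossip-averaging step: each agent moves toward the weighted
    # average of its neighbors at averaging rate gamma.
    # params: (n_agents, dim); W: doubly-stochastic mixing matrix.
    return params + gamma * (W @ params - params)

def scheduled_gamma(step, total_steps, gamma_min=0.1, gamma_max=1.0):
    # Hypothetical linear warm-up schedule for the averaging rate
    # (an assumption for illustration, not the paper's schedule).
    return gamma_min + (gamma_max - gamma_min) * step / max(1, total_steps - 1)

# Toy run: 4 agents on a ring with heterogeneous initial parameters.
n, dim, T = 4, 3, 20
rng = np.random.default_rng(0)
params = rng.normal(size=(n, dim))

W = np.zeros((n, n))
for i in range(n):                 # ring-topology mixing matrix
    W[i, i] = 0.5
    W[i, (i - 1) % n] = 0.25
    W[i, (i + 1) % n] = 0.25

spread0 = np.linalg.norm(params - params.mean(axis=0))
for t in range(T):
    params = gossip_step(params, W, scheduled_gamma(t, T))
spread = np.linalg.norm(params - params.mean(axis=0))
print(spread < spread0)  # agents have moved toward consensus
```

In this toy setting a small early averaging rate limits how strongly heterogeneous neighbor models pull on each agent, while the rate grows later to drive consensus; a constant rate would apply the same mixing strength throughout.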
Submission Number: 38