A Rate–Distortion View on Model Updates

01 Mar 2023 (modified: 01 Jun 2023) · Submitted to Tiny Papers @ ICLR 2023 · Readers: Everyone
Keywords: Federated Learning, Information Theory, Compression
TL;DR: We frame the communication of model updates in federated learning as a rate–distortion optimization problem and present an empirically justified compression algorithm with superior rate–distortion performance compared to existing approaches.
Abstract: Compressing model updates is critical for reducing communication costs in federated learning. We examine the problem through the lens of rate–distortion theory and present a compression method that is near-optimal in many use cases. We empirically show that common transforms applied to model updates in standard compression algorithms (normalization in QSGD and random rotation in DRIVE) yield sub-optimal compressed representations in practice.
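To make the rate–distortion framing concrete, below is a minimal sketch of the QSGD-style normalize-and-quantize baseline the abstract critiques, together with a toy rate-versus-distortion measurement. The function name `qsgd_quantize` and the naive per-coordinate rate accounting are illustrative assumptions, not the paper's algorithm or the original QSGD implementation.

```python
import numpy as np

def qsgd_quantize(v, s=4, rng=None):
    """QSGD-style stochastic quantization (illustrative sketch, not the paper's method).

    Normalizes the update by its L2 norm, then stochastically rounds each
    coordinate's magnitude to one of s uniform levels in [0, 1].
    """
    rng = np.random.default_rng() if rng is None else rng
    norm = np.linalg.norm(v)
    if norm == 0.0:
        return np.zeros_like(v)
    scaled = np.abs(v) / norm * s           # magnitudes mapped into [0, s]
    floor = np.floor(scaled)
    prob = scaled - floor                   # probability of rounding up (unbiased)
    levels = floor + (rng.random(v.shape) < prob)
    return norm * np.sign(v) * levels / s

# Toy rate-distortion check: distortion (MSE) vs. a naive rate estimate
# (bits per coordinate) as the number of quantization levels s grows.
update = np.random.default_rng(0).standard_normal(10_000)  # stand-in model update
for s in (1, 4, 16):
    q = qsgd_quantize(update, s=s)
    mse = np.mean((update - q) ** 2)
    bits = np.log2(2 * s + 1)               # 2s+1 signed levels, uncoded (naive)
    print(f"s={s:>2}  rate≈{bits:4.2f} bits/coord  MSE={mse:.4f}")
```

Sweeping `s` traces out an empirical rate–distortion curve for this baseline; the paper's claim is that transforms like this normalization (and DRIVE's random rotation) sit measurably above the achievable rate–distortion frontier.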