FedSZ: Leveraging Error-Bounded Lossy Compression for Federated Learning Communications

Published: 01 Jan 2024 · Last Modified: 09 Nov 2024 · IPDPS (Workshops) 2024 · CC BY-SA 4.0
Abstract: We introduce FedSZ, a lossy-compression algorithm designed to reduce the size of local model updates in federated learning (FL). FedSZ incorporates a compression pipeline comprising data partitioning, lossy and lossless compression, and serialization. Experiments reveal that a relative error bound of 10^-2 achieves an optimal trade-off, compressing model states by 5.55–12.61× while keeping accuracy within 0.5% of uncompressed results. The runtime overhead of FedSZ is under 4.7%, and it significantly reduces network transfer times. These findings validate FedSZ's effectiveness at balancing communication efficiency and model performance in FL.
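The pipeline described above can be illustrated with a minimal sketch. This is not FedSZ's implementation (which uses SZ's predictor-based error-bounded compressor): here a simple uniform quantizer stands in for the lossy stage, zlib for the lossless stage, and the function names and per-layer handling are illustrative assumptions.

```python
import zlib
import numpy as np

def compress_update(weights, rel_error=1e-2):
    """Compress a list of weight arrays under a relative error bound.

    Illustrative stand-in for an SZ-style pipeline: per-layer
    partitioning -> error-bounded (lossy) quantization -> lossless
    compression of the quantized codes -> serialized payload.
    """
    payload = []
    for w in weights:
        # Absolute error bound derived from this layer's value range.
        bound = rel_error * (float(w.max()) - float(w.min()) + 1e-12)
        # Uniform quantization with step 2*bound guarantees
        # |w - reconstruction| <= bound element-wise.
        codes = np.round(w / (2 * bound)).astype(np.int32)
        payload.append((w.shape, bound, zlib.compress(codes.tobytes())))
    return payload

def decompress_update(payload):
    """Invert the pipeline: decompress codes, dequantize, reshape."""
    out = []
    for shape, bound, blob in payload:
        codes = np.frombuffer(zlib.decompress(blob), dtype=np.int32)
        out.append((codes * (2 * bound)).reshape(shape))
    return out
```

The key property, shared with error-bounded compressors like SZ, is that every reconstructed value stays within the user-specified bound of the original, so the effect on model accuracy can be controlled directly by the error-bound knob.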