Asynchronous Federated Learning with Bidirectional Quantized Communications and Buffered Aggregation
Keywords: Federated Learning, Asynchronous, Quantized Communications, Compressed Communications
TL;DR: We present a practical algorithm for asynchronous FL that reduces communication costs, and prove theoretical convergence guarantees for it.
Abstract: Asynchronous Federated Learning with Buffered Aggregation (FedBuff) is a state-of-the-art algorithm known for its efficiency and high scalability.
However, it has a high communication cost, which prior work has not addressed through quantized communications.
To tackle this problem, we present a new algorithm (QAFeL), with a quantization scheme that establishes a shared "hidden" state between the server and clients to avoid the error propagation caused by direct quantization.
This approach allows for high precision while significantly reducing the data transmitted during client-server interactions.
We provide theoretical convergence guarantees for QAFeL and corroborate our analysis with experiments on a standard benchmark.
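To illustrate the hidden-state idea described in the abstract, below is a minimal Python sketch of one plausible realization: both endpoints keep an identical state h, the sender quantizes only the residual x - h, and both apply the same quantized update, so quantization error does not accumulate across rounds. The stochastic uniform quantizer, the class name HiddenStateChannel, and its encode/decode methods are illustrative assumptions for this sketch, not QAFeL's actual interface.

```python
import numpy as np

def quantize(v, num_levels=256):
    """Hypothetical stochastic uniform quantizer (assumed for illustration;
    the paper's exact quantizer may differ). Maps each coordinate of v onto
    `num_levels` evenly spaced levels spanning [min(v), max(v)]."""
    lo, hi = v.min(), v.max()
    if hi == lo:
        return v.copy()
    scale = (hi - lo) / (num_levels - 1)
    # Stochastic rounding keeps the quantizer unbiased in expectation.
    idx = np.floor((v - lo) / scale + np.random.rand(*v.shape))
    return lo + idx * scale

class HiddenStateChannel:
    """Sketch of a shared hidden state: the sender transmits only the
    quantized difference between the true vector and the hidden state,
    and both ends apply the same update, so their copies of `h` stay
    identical and direct-quantization error cannot propagate."""
    def __init__(self, dim):
        self.h = np.zeros(dim)     # identical initial state on both ends

    def encode(self, x):
        q = quantize(x - self.h)   # quantize the residual, not x itself
        self.h += q                # sender-side hidden-state update
        return q                   # this is all that goes over the wire

    def decode(self, q):
        self.h += q                # receiver applies the same update
        return self.h              # receiver's estimate of x

# Usage: client and server hidden states remain bit-identical after a round.
client = HiddenStateChannel(4)
server = HiddenStateChannel(4)
x = np.array([0.3, -1.2, 0.7, 2.5])
msg = client.encode(x)             # only the quantized residual is sent
estimate = server.decode(msg)
assert np.allclose(client.h, server.h)
```

Because only the residual is quantized, the message magnitude shrinks as the hidden state tracks the true vector, which is what allows coarse quantization without the error blow-up of quantizing the model directly.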
Submission Number: 3