Distributed Optimization for Quadratic Cost Functions With Quantized Communication and Finite-Time Convergence
Abstract: In this article, we propose two distributed iterative algorithms that solve the distributed optimization problem for quadratic local cost functions over large-scale networks in finite time. The first algorithm operates synchronously, while the second operates asynchronously. Both algorithms operate exclusively with quantized values; that is, the information stored, processed, and exchanged between neighboring nodes is subject to deterministic uniform quantization. The algorithms rely on event-driven updates in order to reduce energy consumption, communication bandwidth, network congestion, and/or processor usage. Finally, once the algorithms converge, the nodes terminate their operation in a distributed manner. We prove that, depending on the quantization level, our algorithms converge to the exact optimal solution in a finite number of iterations, and we present two applications of our algorithms: optimal task scheduling for data centers, and global model aggregation for distributed federated learning. We provide simulations of these applications to illustrate the operation, performance, and advantages of the proposed algorithms. In addition, we show that our algorithms compare favorably with existing algorithms in the literature.
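To make the setting concrete, the following worked equation sketches why quadratic local costs admit an exact, finite-time distributed solution. The scalar cost form f_i(x) = (1/2) α_i (x − b_i)^2, with curvature α_i > 0 and local minimizer b_i, is an assumed illustration and may differ from the exact formulation in the article:

```latex
% Assumed quadratic local cost at node i (illustrative form, not necessarily
% the article's exact formulation):
%   f_i(x) = \tfrac{1}{2}\,\alpha_i (x - b_i)^2, \qquad \alpha_i > 0.
% The global cost and its stationarity condition give a closed-form optimum:
\begin{align}
  F(x) &= \sum_{i=1}^{n} \tfrac{1}{2}\,\alpha_i (x - b_i)^2, \\
  F'(x^{*}) &= \sum_{i=1}^{n} \alpha_i \bigl(x^{*} - b_i\bigr) = 0
  \;\;\Longrightarrow\;\;
  x^{*} = \frac{\sum_{i=1}^{n} \alpha_i b_i}{\sum_{i=1}^{n} \alpha_i}.
\end{align}
```

Because x* is a weighted average of the local minimizers, it can in principle be computed by consensus-type averaging of the local pairs (α_i b_i, α_i), which is what makes finite-time operation with quantized values plausible; the article's specific update rules are not reproduced here.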