Q-DADAM: A Quantized Distributed Online Optimization Algorithm With Adaptive Momentum

Published: 2025 · Last Modified: 04 Nov 2025 · IEEE Trans. Control. Netw. Syst. 2025 · CC BY-SA 4.0
Abstract: This article investigates distributed online optimization for a group of agents communicating over undirected networks. The objective is to collaboratively minimize the sum of locally known convex cost functions under communication bandwidth limitations. To tackle this challenge, we propose Q-DADAM, a quantized distributed adaptive momentum algorithm in which agents interact only with their neighbors to optimize the global cost function collectively. Unlike many existing distributed online optimization algorithms that overlook bandwidth constraints, Q-DADAM employs random quantization to reduce the volume of transmitted data, making it practical for channels with limited capacity. In contrast to algorithms that omit adaptive momentum, Q-DADAM incorporates adaptive momentum updates, which improves convergence behavior and overall performance. Theoretical analysis shows that, with an appropriate step size and quantization level, Q-DADAM reduces communication traffic and achieves sublinear dynamic regret. Simulation experiments validate its practicality and effectiveness. In addition, we discuss how the quantization level and the number of agents affect the convergence of Q-DADAM.
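To make the two ingredients named in the abstract concrete, the following is a minimal sketch of how an unbiased random quantizer can be combined with an Adam-style adaptive momentum update and consensus mixing over an undirected network. The function names (random_quantize, quantized_adaptive_momentum_step) and parameters (delta, alpha, beta1, beta2, the mixing matrix W) are illustrative assumptions; this is not the paper's exact Q-DADAM recursion, step-size schedule, or regret analysis.

```python
import numpy as np

def random_quantize(x, delta=0.1, rng=None):
    """Unbiased stochastic quantizer: round each entry of x to the grid
    delta * Z, picking the upper grid point with probability equal to the
    fractional distance, so that E[Q(x)] = x."""
    rng = np.random.default_rng() if rng is None else rng
    scaled = x / delta
    low = np.floor(scaled)
    prob_up = scaled - low                 # distance to the lower grid point
    up = rng.random(x.shape) < prob_up
    return delta * (low + up)

def quantized_adaptive_momentum_step(x, m, v, grads, W, alpha=0.01,
                                     beta1=0.9, beta2=0.999, eps=1e-8,
                                     delta=0.1, rng=None):
    """One synchronous round of a quantized, Adam-style distributed update
    (illustrative only).

    x, m, v : (n_agents, dim) arrays of iterates and moment estimates
    grads   : (n_agents, dim) local (sub)gradients at the current iterates
    W       : (n_agents, n_agents) doubly stochastic mixing matrix of the
              undirected communication graph
    """
    # Each agent broadcasts only a quantized copy of its iterate,
    # reducing the number of bits sent per round.
    x_q = random_quantize(x, delta, rng)
    # Consensus step on the quantized neighbor information.
    x_mix = W @ x_q
    # Adaptive momentum (Adam-style) moment updates from local gradients.
    m = beta1 * m + (1.0 - beta1) * grads
    v = beta2 * v + (1.0 - beta2) * grads ** 2
    # Local descent step scaled by the adaptive preconditioner.
    x_new = x_mix - alpha * m / (np.sqrt(v) + eps)
    return x_new, m, v
```

The unbiasedness of the quantizer is what lets a coarser grid (larger delta) trade communication volume against the extra quantization noise the analysis must absorb, which is consistent with the abstract's discussion of how the quantization level affects convergence.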