Abstract: Analyzing consensus algorithms on $r$-nearest neighbor ring networks is critical for understanding the efficiency and reliability of large-scale distributed networks. The special structure of the $r$-nearest neighbor ring offers multiple communication paths, accelerates convergence, and improves the robustness of consensus algorithms. However, this increased connectivity also makes performance evaluation significantly more complex, since key metrics are typically defined in terms of Laplacian eigenvalues. In particular, estimating the largest eigenvalue of the Laplacian matrix remains a major challenge for $r$-nearest neighbor ring networks. We reformulate the Laplacian eigenvalue maximization problem as a Dirichlet kernel minimization problem. First, using a shift argument, we prove that the first and last lobes of the Dirichlet kernel are the deepest. Next, we apply local smoothness analysis and integer rounding arguments to show that at least one discrete sample attains the global minimum within such a lobe. This study presents a rigorous analysis that precisely locates and computes the largest eigenvalue, yielding exact expressions for key performance metrics, including convergence time, first-order network coherence, second-order network coherence, and maximum communication delay, with reduced computational complexity. In addition, our findings illustrate the role of $r$ in improving the performance of consensus algorithms in large-scale networks.
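As a minimal numerical sketch of the reformulation described above (not the paper's proof), the snippet below assumes an $n$-node ring in which each node is linked to its $r$ nearest neighbors on each side (degree $2r$), so the circulant Laplacian has eigenvalues $\lambda_k = 2r - 2\sum_{j=1}^{r}\cos(2\pi jk/n) = (2r+1) - D_r(2\pi k/n)$, where $D_r$ is the Dirichlet kernel; maximizing $\lambda_k$ is then equivalent to minimizing $D_r$ over the discrete samples $x_k = 2\pi k/n$. The parameters `n`, `r` and the function names are illustrative only.

```python
import numpy as np

def dirichlet_kernel(x, r):
    """D_r(x) = sin((r + 1/2) x) / sin(x / 2), with the limiting value 2r + 1 at x = 0."""
    x = np.asarray(x, dtype=float)
    num = np.sin((r + 0.5) * x)
    den = np.sin(x / 2.0)
    safe_den = np.where(np.isclose(den, 0.0), 1.0, den)
    return np.where(np.isclose(den, 0.0), 2 * r + 1, num / safe_den)

def laplacian_eigenvalues(n, r):
    """Eigenvalues of the r-nearest neighbor ring Laplacian via the circulant formula."""
    k = np.arange(n)
    return (2 * r + 1) - dirichlet_kernel(2 * np.pi * k / n, r)

# Illustrative parameters (hypothetical): n = 101 nodes, r = 3 neighbors per side.
n, r = 101, 3
lam = laplacian_eigenvalues(n, r)
k_max = int(np.argmax(lam))   # index achieving the largest Laplacian eigenvalue
print(k_max, lam[k_max])      # the same index minimizes the Dirichlet kernel samples
```

In this sketch the largest eigenvalue is obtained directly from the eigenvalue formula; the analysis in the paper instead locates the minimizing sample of the Dirichlet kernel in closed form, avoiding the $O(n)$ sweep over all indices.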