Achieving Optimal Breakdown for Byzantine-Robust Gossip

ICLR 2025 Conference Submission 3074 Authors

23 Sept 2024 (modified: 27 Nov 2024) · ICLR 2025 Conference Submission · CC BY 4.0
Keywords: Byzantine, Robustness, Decentralized, Gossip, Averaging, SGD
Abstract: Distributed approaches have many computational benefits, but they are vulnerable to attacks from a subset of devices transmitting incorrect information. This paper investigates Byzantine-resilient algorithms in a decentralized setting, where devices communicate directly with one another. We study the notion of breakdown point and establish an upper bound on the number of adversaries that decentralized algorithms can tolerate. We introduce an algorithmic framework that recovers ClippedGossip and NNA, two popular approaches for robust decentralized learning, as special cases. This framework allows us to generalize NNA to sparse graphs and to introduce CG+, which lies at the intersection of the two. Our unified analysis gives near-optimal guarantees for CG+ (and, under additional assumptions, for the other approaches). Experimental evidence validates the effectiveness of CG+ and its gap with NNA, in particular against a novel attack tailored to sparse graphs that we introduce.
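For readers unfamiliar with the robust aggregation rules named in the abstract, the snippet below is a minimal Python sketch of a ClippedGossip-style update, in which a node bounds each neighbor's contribution before mixing. The function name, its signature, and the fixed clipping radius `tau` are illustrative assumptions; the paper's CG+ and NNA rules differ in how neighbors are weighted and clipped.

```python
import numpy as np

def clipped_gossip_step(x_i, neighbor_xs, weights, tau):
    """One ClippedGossip-style update for node i (illustrative sketch, not the paper's CG+).

    x_i         : current parameter vector of node i
    neighbor_xs : list of parameter vectors received from neighbors
    weights     : nonnegative mixing weights for each neighbor (summing to at most 1)
    tau         : clipping radius bounding the influence of any single neighbor
    """
    update = np.zeros_like(x_i)
    for w, x_j in zip(weights, neighbor_xs):
        diff = x_j - x_i
        norm = np.linalg.norm(diff)
        # Scale down large disagreements so a Byzantine neighbor cannot pull
        # the local iterate arbitrarily far in a single gossip round.
        scale = min(1.0, tau / norm) if norm > 0 else 1.0
        update += w * scale * diff
    return x_i + update
```

Without clipping (tau set to infinity), this reduces to standard gossip averaging, which a single malicious neighbor can break; the clipping step is what yields a nonzero breakdown point.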
Primary Area: optimization
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Reciprocal Reviewing: I understand the reciprocal reviewing requirement as described on https://iclr.cc/Conferences/2025/CallForPapers. If none of the authors are registered as a reviewer, it may result in a desk rejection at the discretion of the program chairs. To request an exception, please complete this form at https://forms.gle/Huojr6VjkFxiQsUp6.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 3074