Communication Efficient Primal Dual Algorithm for Nonconvex Nonsmooth Distributed Optimization

28 Sept 2020 (modified: 05 May 2023) · ICLR 2021 Conference Withdrawn Submission · Readers: Everyone
Abstract: Decentralized optimization problems frequently arise in large-scale machine learning, yet few works address the nonconvex nonsmooth case. In this paper, we propose a primal-dual algorithm for solving nonconvex nonsmooth decentralized optimization problems. To reduce communication overhead, we further introduce a compression function. We analyze the convergence of the algorithm and show that it matches the lower bound on iteration complexity. Finally, we conduct two experiments, both of which demonstrate the efficacy of our algorithm.
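To make the abstract's idea concrete, the sketch below shows, under stated assumptions, how a compression operator might be combined with a decentralized primal-dual-style update. The paper's actual algorithm, notation, and compression scheme are not reproduced here, so every name (top_k_compress, decentralized_primal_dual, the mixing matrix W, step size eta) and the specific update rules are illustrative assumptions, not the authors' method.

```python
# Hypothetical sketch: decentralized primal-dual-style updates with top-k
# compression of transmitted iterates. All names and update rules are
# illustrative assumptions; they are not taken from the paper.
import numpy as np


def top_k_compress(v, k):
    """Keep only the k largest-magnitude entries of v (a common compression choice)."""
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]
    out[idx] = v[idx]
    return out


def decentralized_primal_dual(local_grads, W, d, T=100, eta=0.1, k=5, seed=0):
    """Run T rounds of a compressed primal-dual-style scheme on n nodes.

    local_grads: list of callables; local_grads[i](x) returns node i's local gradient.
    W: doubly stochastic mixing matrix encoding the communication graph.
    """
    rng = np.random.default_rng(seed)
    n = W.shape[0]
    x = rng.standard_normal((n, d))   # primal variables, one row per node
    lam = np.zeros((n, d))            # dual variables tracking consensus violation
    for _ in range(T):
        # Each node compresses the message it sends to its neighbors.
        msgs = np.stack([top_k_compress(x[i], k) for i in range(n)])
        mixed = W @ msgs              # neighbor averaging of compressed iterates
        for i in range(n):
            g = local_grads[i](x[i])
            # Dual ascent on the consensus constraint, then a primal gradient step.
            lam[i] += eta * (x[i] - mixed[i])
            x[i] -= eta * (g + lam[i])
    return x.mean(axis=0)


if __name__ == "__main__":
    # Toy example: each of 4 nodes on a ring holds a quadratic loss 0.5 * ||x - b_i||^2.
    n, d = 4, 10
    rng = np.random.default_rng(1)
    targets = rng.standard_normal((n, d))
    grads = [lambda x, b=b: x - b for b in targets]
    W = 0.5 * np.eye(n) + 0.25 * (np.roll(np.eye(n), 1, axis=0) + np.roll(np.eye(n), -1, axis=0))
    x_hat = decentralized_primal_dual(grads, W, d)
    print("distance to average target:", np.linalg.norm(x_hat - targets.mean(axis=0)))
```

The toy run only illustrates the communication pattern (compress, mix over the graph, then take local primal and dual steps); it does not reflect the paper's convergence guarantees or its handling of the nonsmooth term.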
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics
Reviewed Version (pdf): https://openreview.net/references/pdf?id=oMbPFdWztt