Keywords: Federated Learning, Proximal Correction with Hessian and Cosine Correlation, FedHC
TL;DR: FedHC combines Hessian-based proximal correction and cosine correlation to accelerate convergence in federated learning.
Abstract: Federated learning (FL), a prominent distributed learning approach, alternates collaborative
updates among participants with individual updates on private data.
Widely used FL methods, such as FedDC, traditionally rely
on first-order optimization techniques like Stochastic Gradient Descent (SGD) to
achieve convergence, but there is growing interest in leveraging second-order optimization
to speed up convergence in complex models. However, applying
second-order techniques directly to FL models often leads to convergence difficulties.
To address these issues, we present FedHC, an integrated methodology
that combines proximal correction with Hessian optimization and
cosine correlation for FL. FedHC introduces a Hessian-based optimizer with proximal
correction to accelerate convergence, and employs cosine correlation
to reduce learning discrepancies and bridge the gap between local and global
models. Experiments on four datasets demonstrate
that FedHC significantly accelerates convergence and outperforms existing
methods on various image classification tasks, remaining robust in both IID
and Non-IID client settings.
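
For concreteness, below is a minimal sketch of what a FedHC-style local client update could look like: a Newton-type (Hessian) step on a logistic-regression objective augmented with a proximal term toward the global model and a cosine-correlation regularizer. The exact objective, the hyperparameters `mu` and `lam`, and the function `local_update` are illustrative assumptions for this sketch, not the paper's actual implementation.

```python
# Minimal sketch of a FedHC-style local update on a logistic-regression
# client. The objective form and hyperparameters are assumptions, not the
# authors' code.
import numpy as np

def cosine(a, b, eps=1e-12):
    """Cosine similarity between two parameter vectors."""
    return float(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b) + eps)

def local_update(w_global, X, y, mu=0.1, lam=0.1, steps=5):
    """One client's Newton-style update with proximal and cosine terms.

    Assumed objective:
        L(w) = logistic loss(w; X, y)
             + (mu / 2) * ||w - w_global||^2    # proximal correction
             + lam * (1 - cos(w, w_global))     # cosine correlation
    The Hessian of the logistic + proximal parts is computed exactly;
    the cosine term contributes its gradient only, for simplicity.
    """
    w = w_global.copy()
    n = len(y)
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ w))        # predicted probabilities
        grad = X.T @ (p - y) / n + mu * (w - w_global)
        # Gradient of lam * (1 - cos(w, w_global)) with respect to w.
        nw, ng = np.linalg.norm(w), np.linalg.norm(w_global)
        cos = cosine(w, w_global)
        grad += -lam * (w_global / (nw * ng) - cos * w / nw**2)
        # Exact Hessian of the logistic + proximal parts: X^T diag(S) X / n + mu*I.
        S = p * (1.0 - p)
        H = (X.T * S) @ X / n + mu * np.eye(len(w))
        w -= np.linalg.solve(H, grad)           # Newton / Hessian step
    return w

# Tiny usage example with synthetic data.
rng = np.random.default_rng(0)
X = rng.normal(size=(64, 5))
y = (X @ rng.normal(size=5) > 0).astype(float)
w_new = local_update(np.zeros(5), X, y)
```

Since the proximal term adds `mu * I` to the Hessian, the Newton solve stays well conditioned even when the local data are scarce, which is one plausible reading of how proximal correction stabilizes a second-order step in FL.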
Supplementary Material: pdf
Primary Area: representation learning for computer vision, audio, language, and other modalities
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 3657