$\texttt{FED-CURE}$: A Robust Federated Learning Algorithm with Cubic Regularized Newton

Published: 19 Jun 2023, Last Modified: 21 Jul 2023, FL-ICML 2023
Keywords: Federated Learning, Escaping Saddle Points, Byzantine Resilience, Compression
TL;DR: We propose a robust, communication-efficient, saddle-point-escaping algorithm for Federated Learning.
Abstract: In this paper, we analyze the cubic-regularized Newton method, which escapes saddle points in non-convex optimization, within the Federated Learning (FL) framework, while simultaneously addressing practical challenges that naturally arise in FL, such as the communication bottleneck and Byzantine attacks. We propose FEDerated CUbic REgularized Newton ($\texttt{FED-CURE}$) and obtain convergence guarantees in several settings. Because $\texttt{FED-CURE}$ is a second-order algorithm, its iteration complexity is much lower than that of its first-order counterparts; furthermore, it can employ compression (or sparsification) techniques such as $\delta$-approximate compression for communication efficiency and norm-based thresholding for Byzantine resilience. We validate the performance of $\texttt{FED-CURE}$ with experiments on standard datasets under several types of Byzantine attacks, and observe a $25\%$ improvement in total iteration complexity over first-order methods.
Submission Number: 11
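
For concreteness, below is a minimal NumPy sketch of the three ingredients the abstract names: the cubic-regularized Newton subproblem, $\delta$-approximate compression via top-$k$ sparsification, and norm-based thresholding for Byzantine resilience. This is not the authors' implementation; the function names, the gradient-descent subproblem solver, and the parameters (`lr`, `iters`, `tau`, `M`) are illustrative assumptions.

```python
import numpy as np

def cubic_newton_step(grad, hess, M, lr=1e-2, iters=500):
    """Approximately solve the cubic-regularized Newton subproblem
        min_s  <grad, s> + 0.5 s^T hess s + (M/6) ||s||^3
    by gradient descent on the regularized model (an illustrative solver)."""
    s = np.zeros_like(grad)
    for _ in range(iters):
        # Gradient of the cubic model: grad + H s + (M/2) ||s|| s
        s = s - lr * (grad + hess @ s + 0.5 * M * np.linalg.norm(s) * s)
    return s

def topk_compress(v, k):
    """delta-approximate compression via top-k sparsification: keeping the
    k largest-magnitude coordinates of a d-dimensional vector satisfies
    ||v - C(v)||^2 <= (1 - k/d) ||v||^2, i.e. delta = 1 - k/d."""
    out = np.zeros_like(v)
    idx = np.argpartition(np.abs(v), -k)[-k:]
    out[idx] = v[idx]
    return out

def norm_thresholded_mean(updates, tau):
    """Norm-based thresholding for Byzantine resilience: discard any worker
    message whose norm exceeds tau, then average the survivors."""
    kept = [u for u in updates if np.linalg.norm(u) <= tau]
    return np.mean(kept, axis=0) if kept else np.zeros_like(updates[0])

# Toy usage: one server-side round on synthetic worker gradients.
rng = np.random.default_rng(0)
d, workers = 20, 8
grads = [topk_compress(rng.standard_normal(d), k=5) for _ in range(workers)]
grads[0] = 1e3 * np.ones(d)                  # one Byzantine worker
g = norm_thresholded_mean(grads, tau=10.0)   # robust aggregate
H = rng.standard_normal((d, d))
H = 0.5 * (H + H.T)                          # symmetric test Hessian
step = cubic_newton_step(g, H, M=10.0)
```

In this sketch workers would send compressed gradients (and, in a second-order method, compressed Hessian information) to the server, which filters out large-norm messages before solving the cubic subproblem; the exact message contents and schedule in $\texttt{FED-CURE}$ are specified in the paper, not here.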