FN-NOW: A Communication-Efficient Newton-Type Federated Learning Algorithm via Low-Rank Hessian Approximation

18 Sept 2025 (modified: 12 Nov 2025) · ICLR 2026 Conference Withdrawn Submission · CC BY 4.0
Keywords: Federated learning, Second-order optimizer, Hessian approximation, Nyström approximation, Woodbury formula.
Abstract: Newton-type algorithms have become a promising direction for improving federated learning (FL). Their faster convergence offers a path to greater communication efficiency in FL. However, these methods rely on the full Hessian, introducing significant computational, memory, and communication overhead. In this paper, we propose FN-NOW, a communication-efficient Newton-type federated optimization algorithm based on a low-rank approximation of the Hessian. Specifically, FN-NOW leverages the Nyström method and the Woodbury identity to efficiently approximate the Hessian inverse, enabling communication-efficient training through fast convergence while keeping memory overhead comparable to that of first-order methods. We provide a theoretical analysis showing that FN-NOW achieves a linear convergence rate under standard assumptions, outperforming typical first-order methods. Extensive experiments demonstrate that FN-NOW consistently outperforms existing methods in both convergence speed and predictive performance, making it well suited for deployment in resource-constrained FL settings.
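The abstract's core mechanism, sketching the Hessian with the Nyström method and applying its approximate inverse via the Woodbury identity, can be illustrated compactly. The following is a minimal sketch under stated assumptions, not the paper's implementation: the Hessian-vector-product oracle `hvp`, the sketch rank `rank`, the damping term `rho`, and the Gaussian test matrix are all illustrative choices not taken from the paper.

```python
import numpy as np

def nystrom_newton_step(hvp, grad, dim, rank=10, rho=1e-3, seed=0):
    """Approximate (H + rho*I)^{-1} @ grad using a rank-`rank` Nystrom
    sketch of H and the Woodbury identity. Only Hessian-vector products
    are needed; the full dim x dim Hessian is never formed."""
    rng = np.random.default_rng(seed)
    # Gaussian test matrix Omega (dim x rank) -- an illustrative choice.
    omega = rng.standard_normal((dim, rank))
    # Sketch C = H @ Omega via `rank` Hessian-vector products.
    C = np.column_stack([hvp(omega[:, j]) for j in range(rank)])
    # Core matrix W = Omega^T H Omega, symmetrized for numerical safety.
    W = omega.T @ C
    W = 0.5 * (W + W.T)
    # Nystrom approximation: H ~ C W^{-1} C^T. With damping rho, the
    # Woodbury identity gives
    #   (rho*I + C W^{-1} C^T)^{-1} g
    #     = (g - C (rho*W + C^T C)^{-1} C^T g) / rho.
    inner = rho * W + C.T @ C
    correction = C @ np.linalg.solve(inner, C.T @ grad)
    return (grad - correction) / rho

# Toy check: a PSD Hessian surrogate of exact rank r, where the
# rank-r Nystrom sketch recovers H and the step matches a dense solve.
dim, r = 50, 8
A = np.random.default_rng(1).standard_normal((dim, r))
H = A @ A.T
g = np.ones(dim)
step = nystrom_newton_step(lambda v: H @ v, g, dim, rank=r, rho=1e-3)
exact = np.linalg.solve(H + 1e-3 * np.eye(dim), g)
print(np.linalg.norm(step - exact) / np.linalg.norm(exact))  # ~1e-12
```

Because only C (dim x rank) and W (rank x rank) are ever stored, the memory footprint is O(dim * rank), consistent with the abstract's claim of overhead comparable to first-order methods.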
Supplementary Material: zip
Primary Area: optimization
Submission Number: 11126