Painless Federated Learning: An Interplay of Line-search and Extrapolation

ICLR 2026 Conference Submission 19650 Authors

19 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: Optimization, Federated Learning, Line-Search
TL;DR: Line-search in Federated Learning for retrieving deterministic rates in expectation.
Abstract: The classical line search for learning rate (LR) tuning in the stochastic gradient descent (SGD) algorithm can also tame the convergence slowdown caused by data-sampling noise. In the federated setting, where client heterogeneity slows global convergence, line search can be adapted to similar effect. In this work, we show that a stochastic variant of line search tames heterogeneity in federated optimization, in addition to addressing the slowdown in client-local optimization due to gradient noise. To this end, we introduce the Federated Stochastic Line Search (FedSLS) algorithm and show that, for convex functions, it achieves deterministic rates in expectation. Specifically, FedSLS offers linear convergence for strongly convex objectives even with partial client participation. Recently, extrapolation of the server’s LR has shown promise for improved empirical performance in federated learning. Motivated by this, we also extend FedSLS to Federated Extrapolated Stochastic Line Search (FedExpSLS) to take advantage of extrapolation, and we prove its convergence. Our extensive empirical results show that the proposed methods perform on par with or better than popular federated learning algorithms across many convex and non-convex problems.
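To make the high-level idea concrete, the following is a minimal sketch of a federated round that combines an Armijo-style stochastic line search for the client-local LR with an extrapolated server step size. It is not the paper's exact FedSLS/FedExpSLS procedure: the quadratic client losses, the helper names (armijo_step, federated_round), and the particular extrapolation heuristic are illustrative assumptions.

```python
# Sketch: Armijo-style stochastic line search inside local client updates,
# followed by server averaging with an extrapolated server step size.
# Assumptions: synthetic least-squares clients, a FedExp-style extrapolation
# heuristic; this is NOT the submission's exact algorithm.
import numpy as np

rng = np.random.default_rng(0)

# Each client i holds (A_i, b_i) with loss 0.5 * mean((A_i w - b_i)^2).
clients = []
for _ in range(4):
    A = rng.normal(size=(50, 10))
    w_true = rng.normal(size=10)
    b = A @ w_true + 0.1 * rng.normal(size=50)
    clients.append((A, b))

def batch_loss_grad(A, b, w, idx):
    """Mini-batch loss and gradient on the rows indexed by idx."""
    r = A[idx] @ w - b[idx]
    return 0.5 * np.mean(r ** 2), A[idx].T @ r / len(idx)

def armijo_step(A, b, w, idx, eta_max=1.0, c=0.5, beta=0.7):
    """Backtracking (Armijo) line search on the same mini-batch as the gradient."""
    loss, grad = batch_loss_grad(A, b, w, idx)
    eta = eta_max
    while True:
        new_loss, _ = batch_loss_grad(A, b, w - eta * grad, idx)
        if new_loss <= loss - c * eta * np.dot(grad, grad) or eta < 1e-8:
            return w - eta * grad
        eta *= beta  # shrink the candidate LR and retry

def federated_round(w_server, local_steps=5, batch_size=10):
    deltas = []
    for A, b in clients:
        w = w_server.copy()
        for _ in range(local_steps):
            idx = rng.choice(len(b), size=batch_size, replace=False)
            w = armijo_step(A, b, w, idx)
        deltas.append(w_server - w)            # client pseudo-gradient
    deltas = np.stack(deltas)
    avg_delta = deltas.mean(axis=0)
    # Extrapolated server LR (assumed FedExp-style rule): ratio of the mean
    # squared client-delta norm to the squared norm of the averaged delta.
    eta_server = max(1.0, np.mean(np.sum(deltas ** 2, axis=1)) /
                     (2.0 * np.sum(avg_delta ** 2) + 1e-12))
    return w_server - eta_server * avg_delta

w = np.zeros(10)
for _ in range(30):
    w = federated_round(w)
```

In this sketch, the line search removes the need to hand-tune a client LR per problem, while the server-side ratio mimics how extrapolation can take a step larger than plain averaging when client updates agree.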
Primary Area: optimization
Submission Number: 19650