Complexity-Separated Schemes for Addressing Structured Heterogeneity in Federated Learning

15 Sept 2025 (modified: 11 Feb 2026) · Submitted to ICLR 2026 · CC BY 4.0
Keywords: Non-convex optimization, Stochastic optimization, Data similarity, Composite optimization
TL;DR: We address the data-heterogeneity bottleneck in federated learning by developing a series of complexity-separated schemes and analyzing their optimality.
Abstract: Federated learning faces challenges due to heterogeneity in clients' local training sets. Existing methods typically treat this heterogeneity as a monolithic phenomenon, which leads to communication overhead. In this work, we instead examine the structure of data heterogeneity in detail. We identify two forms of it: mode-based, where clients differ in the presence of common versus unique data modes; and coordinate-based, where groups of model parameters differ in their statistical similarity across clients. We develop algorithms that decouple communication complexity along these structural dimensions and consequently reduce synchronization frequency without degrading convergence. Our analysis establishes the optimality of the proposed schemes. Extensive experiments on image and multimodal classification tasks demonstrate improvements in communication efficiency over state-of-the-art methods.
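To make the coordinate-based idea concrete, the sketch below is our own minimal illustration, not the paper's actual algorithm: it assumes per-client quadratic objectives and two hand-picked coordinate groups ("similar" and "dissimilar") with different synchronization periods, showing how communication can be decoupled along parameter groups so that statistically similar coordinates are averaged less often.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (illustrative assumptions, not the paper's experiments):
# each client minimizes f_i(x) = 0.5 * ||x - b_i||^2, where the first
# coordinate block ("similar") is nearly identical across clients and
# the second block ("dissimilar") varies strongly between them.
n_clients, dim, block = 4, 10, 5
b_similar = rng.normal(size=block)  # shared component across clients
targets = [np.concatenate([b_similar + 0.01 * rng.normal(size=block),
                           rng.normal(size=block)])
           for _ in range(n_clients)]

def grad(x, b):
    # Gradient of the quadratic f_i at x.
    return x - b

# Coordinate groups with separate synchronization periods: the
# statistically similar group tolerates longer gaps between averagings.
groups = {"similar":    (slice(0, block), 8),   # sync every 8 local steps
          "dissimilar": (slice(block, dim), 1)} # sync every step

xs = [np.zeros(dim) for _ in range(n_clients)]
lr, steps = 0.1, 64

for t in range(1, steps + 1):
    # Local gradient step on each client.
    for i in range(n_clients):
        xs[i] -= lr * grad(xs[i], targets[i])
    # Group-wise averaging: only groups whose period divides t communicate.
    for name, (idx, period) in groups.items():
        if t % period == 0:
            avg = np.mean([x[idx] for x in xs], axis=0)
            for x in xs:
                x[idx] = avg

print("final max pairwise consensus error:",
      max(np.linalg.norm(xs[i] - xs[0]) for i in range(n_clients)))
```

In this toy run, the similar block communicates eight times less often than the dissimilar one while the clients still reach consensus, which is the communication-separation effect the abstract describes, here under assumed group assignments and periods rather than the paper's derived rates.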
Primary Area: optimization
Submission Number: 6013