Keywords: Federated learning, system heterogeneity, heterogeneous model scaling, pruning, early-exit networks, client contribution valuation, Owen value
Abstract: Cross-device federated learning must cope with heterogeneous clients whose data vary widely in value for training high-performance, well-generalized global models, calling for effective contribution-estimation mechanisms. Width scaling via thinner subnetworks and depth scaling via early exits enable participation by resource-constrained clients but still suffer from (i) noisy aggregation across mismatched subnetworks, (ii) under-trained deep layers when few clients reach them, and (iii) costly, client-isolated contribution estimates. We propose SNOWFL, which pairs server-side single-shot network pruning at initialization (SNIP) with coalition-structured Owen valuation. SNIP uses a small, unlabeled public set to score connections by loss sensitivity and produce layer-consistent width masks per tier, aligned with fixed early exits. During training, we estimate client contributions by first computing Owen values for coalitions and then allocating credit within each coalition via update alignment and diversity. These contribution estimates drive both weighted aggregation and capacity-aware reassignment. We prove nonconvex convergence to stationarity and, under strong convexity on the retained subspace, linear convergence to a neighborhood of the optimum. Under matched FLOPs and parameter budgets, SNOWFL achieves state-of-the-art accuracy on vision and language benchmarks, improving over strong heterogeneity-aware baselines by up to 15%, while valuation remains data-free except for the small public sample used once at initialization.
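For reference, the two building blocks named in the abstract have standard definitions; the sketch below states them as commonly published (SNIP connection saliency from Lee et al., and the Owen value for a coalition structure), and the paper's exact variants may differ. Here, $\mathbf{c}$ are connection masks, $\boldsymbol{\theta}$ the weights, $\mathcal{D}$ the small public set, and $v$ a coalition utility assumed to be defined by the method.

```latex
% SNIP connection saliency (standard form): sensitivity of connection j
% to the loss, evaluated at initialization on the public set \mathcal{D}.
s_j \;=\; \frac{\bigl|g_j(\mathbf{c};\mathcal{D})\bigr|}{\sum_{k=1}^{m}\bigl|g_k(\mathbf{c};\mathcal{D})\bigr|},
\qquad
g_j(\mathbf{c};\mathcal{D}) \;=\; \left.\frac{\partial L(\mathbf{c}\odot\boldsymbol{\theta};\,\mathcal{D})}{\partial c_j}\right|_{\mathbf{c}=\mathbf{1}}

% Owen value (standard form): for a partition \mathcal{B}=\{B_1,\dots,B_M\} of clients
% and a client i \in B_k, averaging marginal contributions over coalition orderings.
\phi_i(v,\mathcal{B}) \;=\;
\sum_{R \subseteq \{1,\dots,M\}\setminus\{k\}}\;
\sum_{S \subseteq B_k\setminus\{i\}}
\frac{|R|!\,(M-|R|-1)!}{M!}\cdot
\frac{|S|!\,(|B_k|-|S|-1)!}{|B_k|!}
\Bigl[v\bigl(Q \cup S \cup \{i\}\bigr) - v\bigl(Q \cup S\bigr)\Bigr],
\quad Q=\textstyle\bigcup_{r\in R} B_r
```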
Supplementary Material: zip
Primary Area: alignment, fairness, safety, privacy, and societal considerations
Submission Number: 21225