Overcome Data Heterogeneity in Federated Learning with Filter Decomposition

21 Sept 2023 (modified: 25 Mar 2024) · ICLR 2024 Conference Withdrawn Submission
Keywords: Federated Learning, Filter Decomposition, Heterogeneous Data, Non-IID
Abstract: Data heterogeneity is one of the major challenges in federated learning (FL), as it results in substantial client variance and slow convergence. In this study, we theoretically and empirically demonstrate that data heterogeneity in FL can be effectively handled by simply decomposing each convolutional filter into a linear combination of filter subspace elements, i.e., filter atoms. This simple technique turns global filter aggregation into the multiplication of aggregated filter atoms (a weighted sum) with aggregated atom coefficients. Mathematically expanding this product of two weighted sums naturally yields numerous additional filter atom-coefficient cross terms, which can be interpreted as implicitly constructing many local model variants as virtual clients. We prove that these virtual clients substantially reduce the variance of the aggregated model. Furthermore, our method permits different training schemes for filter atoms and atom coefficients, enabling highly adaptive model personalization and reduced communication. Our proposed approach outperforms current state-of-the-art federated learning methods in task accuracy, as evidenced by extensive evaluations on benchmark datasets.
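To make the expansion concrete, the following is a minimal NumPy sketch, not the authors' implementation: it assumes each client i parameterizes its filter as atom coefficients c_i times filter atoms D_i, with hypothetical shapes and uniform aggregation weights, and checks numerically that the product of the two separately aggregated factors equals the double sum over all K^2 atom-coefficient pairs, the off-diagonal pairs being the implicit virtual clients.

```python
# Sketch of aggregation-as-product-of-sums in filter decomposition.
# All shapes, weights, and variable names here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
K, m, d = 3, 4, 9              # clients, number of filter atoms, flattened 3x3 atom
alpha = np.full(K, 1.0 / K)    # uniform aggregation weights (an assumption)

C = rng.normal(size=(K, m))     # per-client atom coefficients c_i
D = rng.normal(size=(K, m, d))  # per-client filter atoms D_i

# Server aggregates each factor separately.
C_bar = np.einsum('i,im->m', alpha, C)      # weighted sum of coefficients
D_bar = np.einsum('i,imd->md', alpha, D)    # weighted sum of atoms

# Global filter obtained as the product of the two aggregates.
W_product = C_bar @ D_bar

# The same filter written as a double sum over atom-coefficient pairs:
# diagonal terms (i == j) are the real local models; off-diagonal terms
# act as implicitly constructed virtual clients.
W_expanded = sum(alpha[i] * alpha[j] * (C[i] @ D[j])
                 for i in range(K) for j in range(K))

assert np.allclose(W_product, W_expanded)
print(f'{K} real clients expand into {K * K} atom-coefficient terms '
      f'({K * K - K} of them virtual)')
```

By bilinearity, the assertion holds exactly: with K clients, the product of the aggregates contains K^2 terms, of which K^2 - K are cross terms absent from plain per-client aggregation, matching the abstract's virtual-client interpretation.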
Supplementary Material: zip
Primary Area: general machine learning (i.e., none of the above)
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 4001