FedLog: Personalized Federated Classification with Less Communication and More Flexibility

Published: 21 Apr 2026, Last Modified: 21 Apr 2026. Accepted by TMLR. License: CC BY 4.0
Abstract: Federated representation learning (FRL) aims to learn personalized federated models with effective feature extraction from local data. FRL algorithms that share the majority of model parameters face a significant challenge: huge communication overhead, stemming from the millions of neural network parameters and the slow aggregation progress of the averaging heuristic. To reduce this overhead, we propose FedLog, which shares sufficient data summaries instead of raw model parameters. The data summaries encode minimal sufficient statistics of an exponential family, and Bayesian inference is used for global aggregation. FedLog thereby reduces both message sizes and communication frequency. We prove that the shared messages are minimal sufficient statistics and theoretically analyze the convergence rate of FedLog. To further provide formal privacy guarantees, we extend FedLog with the differential privacy framework. Empirical results demonstrate that our method achieves high learning accuracy with low communication overhead.
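The key idea in the abstract, sharing additive sufficient statistics rather than model weights, can be sketched for a toy Gaussian exponential family. This is an illustrative assumption, not FedLog's actual model or API: each hypothetical `client_summary` computes minimal sufficient statistics of its local features, and the server aggregates them by summation followed by a conjugate Bayesian update on the mean (noise precision assumed known).

```python
import numpy as np

def client_summary(features):
    # Minimal sufficient statistics of a Gaussian exponential family:
    # (sum of features, sum of squared features, sample count).
    # Only these summaries leave the client -- never raw data or weights.
    x = np.asarray(features, dtype=float)
    return x.sum(axis=0), (x ** 2).sum(axis=0), x.shape[0]

def server_aggregate(summaries, prior_mean=0.0, prior_precision=1.0,
                     noise_precision=1.0):
    # Sufficient statistics of an exponential family are additive across
    # clients, so global aggregation is a sum plus one conjugate update,
    # rather than iterative parameter averaging.
    s1 = sum(s[0] for s in summaries)   # total sum of features
    n = sum(s[2] for s in summaries)    # total sample count
    # Conjugate Gaussian posterior over the mean (known noise precision):
    post_precision = prior_precision + n * noise_precision
    post_mean = (prior_precision * prior_mean
                 + noise_precision * s1) / post_precision
    return post_mean, post_precision

# Two clients send fixed-size summaries; the server forms the posterior.
summaries = [client_summary([1.0, 2.0, 3.0]), client_summary([5.0])]
mean, prec = server_aggregate(summaries)  # mean = 2.2, precision = 5.0
```

Note the communication pattern this sketch illustrates: each message has constant size regardless of model dimension, and a single round of summation suffices for exact aggregation, which is the source of the reduced message sizes and communication frequency claimed above.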
Certifications: J2C Certification
Submission Length: Regular submission (no more than 12 pages of main content)
Changes Since Last Submission: Camera-ready version
Code: https://github.com/rossyu/fedlog
Assigned Action Editor: ~Alberto_Bietti1
Submission Number: 6234