Modular Federated Contrastive Learning with Twin Normalization for Resource-limited Clients

Published: 11 Nov 2024, Last Modified: 11 Nov 2024
Accepted by TMLR
License: CC BY 4.0
Abstract: Despite recent progress in federated learning (FL), the challenge of training a global model across clients with heterogeneous, class-imbalanced, and unlabeled data is not fully resolved. Self-supervised learning requires deep and wide networks, and federated training of such networks imposes a heavy communication/computation burden on the client side. We propose Modular Federated Contrastive Learning (MFCL), which changes the training framework from end-to-end to modular: instead of federally training the entire network, only the first layers are trained federally through a server, and the remaining layers are trained at another server without any forward/backward passes between servers. We also propose Twin Normalization (TN) to tackle data heterogeneity. Results show that a ResNet-18 trained with MFCL(TN) on CIFAR-10 achieves $84.1\%$ accuracy under severe data heterogeneity while reducing the communication burden and memory footprint compared to end-to-end training. The code will be released upon paper acceptance.
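To make the modular idea in the abstract concrete, the following is a minimal, hypothetical sketch in PyTorch of the core structural change it describes: only an early block of layers is trained federally (here aggregated with plain FedAvg), while later layers would reside on a separate server. The names EarlyBlock and fedavg_early, the toy architecture, and the use of FedAvg are illustrative assumptions, not the authors' MFCL implementation.

import copy
import torch
import torch.nn as nn

class EarlyBlock(nn.Module):
    """First layers of the network: the only part trained federally by clients."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 64, 3, padding=1), nn.BatchNorm2d(64), nn.ReLU(),
            nn.Conv2d(64, 64, 3, padding=1), nn.BatchNorm2d(64), nn.ReLU(),
        )
    def forward(self, x):
        return self.net(x)

def fedavg_early(client_blocks):
    """Server-side aggregation: average the parameters of the clients' early blocks."""
    global_block = copy.deepcopy(client_blocks[0])
    with torch.no_grad():
        for name, param in global_block.named_parameters():
            stacked = torch.stack(
                [dict(b.named_parameters())[name] for b in client_blocks]
            )
            param.copy_(stacked.mean(dim=0))
    return global_block

# Usage: each client trains its own EarlyBlock locally, then the server averages them.
clients = [EarlyBlock() for _ in range(4)]
global_early = fedavg_early(clients)
print(sum(p.numel() for p in global_early.parameters()), "parameters communicated per round")

Because clients only hold and transmit the early block, the per-round communication and on-device memory scale with that block rather than with the full network, which is the resource saving the abstract claims.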
Submission Length: Regular submission (no more than 12 pages of main content)
Supplementary Material: zip
Assigned Action Editor: ~Ahmad_Beirami1
Submission Number: 3056