DeNAV: Decentralized Self-Supervised Learning with a Training Navigator

16 Sept 2023 (modified: 25 Mar 2024) · ICLR 2024 Conference Withdrawn Submission
Keywords: Self-Supervised Learning, Distributed Learning, Decentralized Architecture
Abstract: Current Federated Self-Supervised Learning (FSSL) methods can learn effectively on edge devices with unlabeled data. However, in realistic settings it is difficult to ensure that large numbers of distributed clients can communicate efficiently with a central server. In this work, we study an essential scenario of Decentralized Self-Supervised Learning (DSSL) built on decentralized communication. It is a highly challenging scenario in which only unlabeled data is used during the pre-training stage and clients exchange only model parameters, without any data sharing. To tackle these problems, we propose a novel method, Decentralized Navigator (DeNAV). DeNAV uses a lightweight pre-training model, the One-Block Masked Autoencoder, together with a training navigator that evaluates selection scores for the connected clients and plans the training route based on these scores, eliminating the reliance on server aggregation in federated learning. Comprehensive experiments demonstrate that DeNAV surpasses state-of-the-art FSSL and Gossip Learning methods in both accuracy and communication cost.
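The abstract does not spell out the scoring or routing rules, so the following Python sketch only illustrates the general idea of a navigator that scores connected clients and routes training among them without a central server. All names (Client, selection_score, navigate_and_train) and the score formula are hypothetical placeholders, not the paper's actual method.

```python
import random
from typing import Dict, List


class Client:
    """One edge device holding only unlabeled local data."""

    def __init__(self, client_id: int, num_samples: int):
        self.client_id = client_id
        self.num_samples = num_samples   # size of the local unlabeled dataset
        self.visits = 0                  # how often the training route has stopped here

    def local_pretrain(self, params: Dict[str, float]) -> Dict[str, float]:
        """Placeholder for local self-supervised pre-training (e.g. a masked autoencoder)."""
        self.visits += 1
        return {k: v + random.uniform(-0.01, 0.01) for k, v in params.items()}


def selection_score(client: Client) -> float:
    """Hypothetical score: favor data-rich clients the route has visited less often."""
    return client.num_samples / (1.0 + client.visits)


def navigate_and_train(start: Client,
                       neighbors: Dict[int, List[Client]],
                       params: Dict[str, float],
                       hops: int) -> Dict[str, float]:
    """Greedily hop to the highest-scoring connected client, training at every stop."""
    current = start
    params = current.local_pretrain(params)
    for _ in range(hops):
        current = max(neighbors[current.client_id], key=selection_score)
        params = current.local_pretrain(params)  # only model parameters travel, never data
    return params


if __name__ == "__main__":
    clients = [Client(i, num_samples=random.randint(100, 1000)) for i in range(5)]
    # Fully connected topology for illustration; real deployments would be sparser.
    topology = {c.client_id: [o for o in clients if o is not c] for c in clients}
    final_params = navigate_and_train(clients[0], topology, params={"w": 0.0}, hops=8)
    print("final params:", final_params)
```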
Supplementary Material: zip
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 568