FEDERATED TRAINING OF DUAL ENCODING MODELS ON SMALL NON-IID CLIENT DATASETS

Published: 04 Mar 2023, Last Modified: 29 Mar 2023. ICLR 2023 Workshop on Trustworthy ML Poster.
Keywords: Dual encoding models, Federated learning, Representation learning, Self-supervised learning, Federated self-supervised learning
TL;DR: Novel approach for training dual encoding models on distributed data composed of many small, non-IID client datasets.
Abstract: Dual encoding models that encode a pair of inputs are widely used for representation learning. Many approaches train dual encoding models by maximizing agreement between pairs of encodings on centralized training data. However, in many scenarios, datasets are inherently decentralized across many clients, motivating federated learning. In this work, we focus on federated training of dual encoding models on decentralized data composed of many small, non-IID (not independent and identically distributed) client datasets. Existing approaches require large and diverse training batches to work well, and they perform poorly when naively adapted to the setting of small, non-IID client datasets using federated averaging. We observe that large-batch loss computation can be simulated on small individual clients for loss functions that are based on encoding statistics. Based on this insight, we propose a novel federated training approach, Distributed Cross Correlation Optimization (DCCO), which trains dual encoding models using encoding statistics aggregated across clients, without sharing individual samples or encodings. Our experimental results on two datasets demonstrate that the proposed approach outperforms federated variants of existing approaches by a large margin.
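To make the key insight concrete, below is a minimal sketch of how a cross-correlation-based loss over encoding statistics could be aggregated across clients. It is an illustrative reading of the abstract, not the paper's implementation: all function names, the statistics exchanged, and the Barlow Twins-style loss form are assumptions.

```python
# Hypothetical sketch: clients share only sufficient statistics of their encodings;
# the server reconstructs the cross-correlation matrix that a single large pooled
# batch would have produced, then applies a Barlow Twins-style objective.
# Names (client_statistics, aggregate_cross_correlation, ...) are illustrative.
import numpy as np


def client_statistics(z_a, z_b):
    """Per-client statistics from local encodings z_a, z_b of shape (n_k, d)."""
    return {
        "n": z_a.shape[0],
        "sum_a": z_a.sum(axis=0),
        "sum_b": z_b.sum(axis=0),
        "sumsq_a": (z_a ** 2).sum(axis=0),
        "sumsq_b": (z_b ** 2).sum(axis=0),
        "outer": z_a.T @ z_b,  # (d, d) sum of per-sample outer products
    }


def aggregate_cross_correlation(stats_list, eps=1e-8):
    """Server-side aggregation of client statistics into a global (d, d)
    cross-correlation matrix, without access to individual encodings."""
    n = sum(s["n"] for s in stats_list)
    sum_a = sum(s["sum_a"] for s in stats_list)
    sum_b = sum(s["sum_b"] for s in stats_list)
    sumsq_a = sum(s["sumsq_a"] for s in stats_list)
    sumsq_b = sum(s["sumsq_b"] for s in stats_list)
    outer = sum(s["outer"] for s in stats_list)

    mean_a, mean_b = sum_a / n, sum_b / n
    var_a = np.maximum(sumsq_a / n - mean_a ** 2, 0.0)
    var_b = np.maximum(sumsq_b / n - mean_b ** 2, 0.0)
    std_a, std_b = np.sqrt(var_a) + eps, np.sqrt(var_b) + eps
    # C_ij = Cov(z_a_i, z_b_j) / (sigma_a_i * sigma_b_j)
    cov = outer / n - np.outer(mean_a, mean_b)
    return cov / np.outer(std_a, std_b)


def cross_correlation_loss(c, lam=5e-3):
    """Barlow Twins-style objective: drive the diagonal of C toward 1
    (agreement) and the off-diagonal toward 0 (redundancy reduction)."""
    on_diag = ((1.0 - np.diag(c)) ** 2).sum()
    off_diag = (c ** 2).sum() - (np.diag(c) ** 2).sum()
    return on_diag + lam * off_diag
```

In this reading, each client's message is O(d^2) regardless of its (small) local dataset size, so the loss computed from the aggregated statistics matches what a large centralized batch would yield, while individual samples and encodings stay on-device.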