FlexSplit: A Configurable, Privacy-Preserving Federated-Split Learning Framework

Published: 01 Jan 2023, Last Modified: 06 Nov 2023 · ICC Workshops 2023
Abstract: In this paper, we address the challenges of preserving the privacy of client data and shared models in a distributed machine learning system. Unlike conventional federated learning, where clients share their entire model, we propose a distributed learning framework named FlexSplit in which each client can select how many layers are trained jointly with the server, giving each client control over its individual privacy level. FlexSplit improves scalability by performing part of the training process at multiple edge servers in parallel before aggregation at the cloud server. Preliminary experimental results show that FlexSplit can achieve higher validation accuracy than a conventional federated learning model. We also highlight the privacy-utility trade-off: clients can raise their privacy level by sharing fewer layers to mitigate privacy attacks, at the cost of lower validation accuracy of their local model.
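The abstract's core mechanism, a client-chosen split point that determines how many layers are shared with the server, can be sketched as follows. This is a minimal illustration; the function name `split_model`, the string-based layer representation, and the cut-index convention are assumptions for clarity, not details from the paper.

```python
def split_model(layers, cut):
    """Partition a model's layers at a client-chosen cut index.

    The client keeps layers[:cut] private; only layers[cut:] are
    trained jointly with the (edge) server. A larger cut means fewer
    shared layers, i.e. a higher privacy level, at a possible cost in
    accuracy -- the privacy-utility trade-off noted in the abstract.
    (Hypothetical sketch; not the paper's actual implementation.)
    """
    if not 0 <= cut <= len(layers):
        raise ValueError("cut must lie within the model depth")
    client_part = layers[:cut]   # trained locally, never shared
    shared_part = layers[cut:]   # trained between client and server
    return client_part, shared_part


model = ["conv1", "conv2", "fc1", "fc2"]

# A privacy-conscious client shares only its final layer...
private_a, shared_a = split_model(model, cut=3)

# ...while another client shares more layers, trading privacy for
# potentially higher validation accuracy.
private_b, shared_b = split_model(model, cut=1)
```

In a full system, each client's `shared_part` would be trained against one of several edge servers in parallel, with the cloud server aggregating the results, as the abstract describes.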