ESMFL: Efficient and Secure Models for Federated Learning

CoRR 2020 (modified: 16 Jan 2022)
Abstract: Deep neural networks are now widely applied across many domains. However, the massive data collection they require raises privacy concerns and consumes large amounts of communication bandwidth. To address these problems, we propose a privacy-preserving method for distributed federated learning systems, built on Intel Software Guard Extensions (SGX), a set of instructions that increases the security of application code and data. Because encrypting the models increases transmission overhead, we further reduce the communication cost through sparsification, achieving reasonable accuracy across different model architectures.
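The abstract names sparsification as the mechanism for cutting communication cost. A common instance of this idea is top-k gradient sparsification, where each client transmits only the largest-magnitude gradient entries; the sketch below illustrates that general technique and is not taken from the paper, so the function name, the `ratio` parameter, and the exact selection rule are assumptions.

```python
import numpy as np

def topk_sparsify(grad, ratio=0.01):
    """Zero out all but the largest-magnitude entries of a gradient tensor.

    Illustrative sketch of top-k sparsification; the paper's actual
    compression scheme may differ in selection rule and bookkeeping.
    """
    flat = grad.ravel()
    k = max(1, int(flat.size * ratio))
    # Indices of the k entries with the largest absolute value.
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    sparse = np.zeros_like(flat)
    sparse[idx] = flat[idx]
    return sparse.reshape(grad.shape)
```

In practice only the surviving (index, value) pairs would be sent over the wire, which is where the bandwidth saving comes from.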