TL;DR: We propose DictPFL, a framework that ensures efficient and private federated learning (FL) by encrypting shared gradients and keeping most gradients local, while still preserving the performance of global gradient aggregation.
Abstract: Federated learning (FL) enables institutions to collaboratively train machine learning models by aggregating local gradients without sharing sensitive data. However, sharing gradients still poses privacy risks, e.g., gradient inversion attacks. Homomorphic encryption (HE) is commonly used in FL to encrypt gradients on the data owner's side, enabling secure aggregation without decryption on the server. Existing HE-based FL methods are either fully encrypted or selectively encrypted: the former ensures privacy but incurs high overhead, while the latter improves efficiency by partially encrypting gradients, leaving the shared unencrypted gradients vulnerable. To enable efficient and private FL, we propose DictPFL, a framework that encrypts all shared gradients while keeping most gradients local, avoiding full gradient transmission without sacrificing the performance of global aggregation. DictPFL comprises two modules: Decompose-for-Partial-Encrypt (DePE) and Prune-for-Minimum-Encrypt (PrME). In DePE, we decompose pre-trained model weights into a dictionary and a lookup table. Only the gradients of the lookup table are encrypted and securely aggregated, while the dictionary remains fixed and is never transmitted. In PrME, we further minimize the number of encrypted parameters via an encryption-aware pruning technique that keeps the pruning mask consistent across clients by leveraging the history of global gradients. Experimental results demonstrate that DictPFL reduces communication overhead by 402 to 748 times and speeds up training by 28 to 65 times compared to fully encrypted methods. It also outperforms state-of-the-art selective-encryption methods, lowering overhead by 51 to 155 times and accelerating training by 4 to 19 times.
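To make the two modules concrete, here is a minimal NumPy sketch of the ideas described in the abstract. All names, shapes, the SVD-based factorization, and the 25% keep ratio are illustrative assumptions, not the paper's actual implementation; the HE step itself is omitted, since the point is only which gradients would need encrypting.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- DePE (sketch): factor a pre-trained weight matrix W into a fixed
# dictionary D and a small trainable lookup table T, so W ~= D @ T.
# Only grad(T) would be encrypted and shared; D stays local, untransmitted.
W = rng.standard_normal((256, 256))   # pre-trained layer weights (toy size)
r = 16                                # assumed decomposition rank
U, s, Vt = np.linalg.svd(W, full_matrices=False)
D = U[:, :r] * s[:r]                  # fixed dictionary (kept local)
T = Vt[:r, :]                         # trainable lookup table (shared)

full_params = W.size                  # what fully encrypted FL would ship
shared_params = T.size                # what DePE would encrypt instead
print(full_params / shared_params)    # communication reduction factor

# --- PrME (sketch): shrink the encrypted payload further with a pruning
# mask every client derives identically from the global gradient history,
# so no extra synchronization round is needed to agree on the mask.
grad_history = np.abs(rng.standard_normal(T.shape))  # stand-in for accumulated global grads
keep = int(0.25 * T.size)             # assumed ratio: keep top 25% of entries
threshold = np.sort(grad_history, axis=None)[-keep]
mask = grad_history >= threshold      # identical on all clients
print(int(mask.sum()))                # lookup-table entries encrypted per round
```

In this toy setting the decomposition alone cuts the encrypted payload by the rank ratio (here 16x), and the history-derived mask cuts it again, which matches the abstract's framing of PrME as a second reduction stage on top of DePE.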
Primary Area: Social Aspects->Privacy
Keywords: Federated Learning, Homomorphic Encryption, Privacy
Submission Number: 13702