Abstract: Personalized Federated Learning (PFL) extends Federated Learning. To better protect clients' local data, differential privacy mechanisms are usually introduced into PFL, but previous approaches add noise directly to all network parameters, which inevitably degrades performance. To address this problem, we propose an adaptive differential privacy approach that adaptively determines the personalized parameters for each client. In this approach, a hypernetwork generates a weight for each layer of the local model; the layers with high weights are kept locally as the personalized model, shielding them from noise interference. This adaptive personalization strategy ensures that the locally retained personalized parameters are unaffected by the noise introduced by differential privacy. Experimental results on the CIFAR-10 and FEMNIST datasets show that our proposed personalized differential privacy algorithm is effective.
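The core idea, splitting a client's layers by learned importance weights so that only the shared layers receive differential-privacy noise, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function `partition_and_noise`, the `keep_ratio` threshold, and the fixed `layer_scores` (standing in for the hypernetwork's per-layer weights) are all hypothetical simplifications, and the Gaussian perturbation is an assumed noise mechanism.

```python
import numpy as np

rng = np.random.default_rng(0)

def partition_and_noise(layer_params, layer_scores, keep_ratio=0.5,
                        noise_std=0.1):
    """Keep the highest-scoring layers local (noise-free, personalized);
    add Gaussian noise to the remaining layers before sharing them.
    `layer_scores` stands in for the per-layer weights a hypernetwork
    would produce (hypothetical simplification)."""
    k = max(1, int(len(layer_params) * keep_ratio))
    # Indices of the k highest-scoring layers: retained locally.
    local_idx = set(np.argsort(layer_scores)[-k:])
    personal, shared = {}, {}
    for i, p in enumerate(layer_params):
        if i in local_idx:
            personal[i] = p                                   # untouched by noise
        else:
            shared[i] = p + rng.normal(0.0, noise_std, p.shape)  # noised, then shared
    return personal, shared

# Toy model: 4 layers of random parameters with per-layer importance scores.
params = [rng.standard_normal(3) for _ in range(4)]
scores = [0.9, 0.1, 0.8, 0.2]
personal, shared = partition_and_noise(params, scores)
print(sorted(personal))  # layers 0 and 2 (highest scores) stay local
```

Because the personalized layers are selected before any noise is added, they remain exact copies of the trained local parameters, which is the property the adaptive strategy is designed to guarantee.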
External IDs: doi:10.1007/978-981-96-5693-6_46