Asynchronous Federated Learning With Local Differential Privacy for Privacy-Enhanced Recommender Systems
Abstract: Recommender systems provide an effective solution to information overload. With the rapid advancement of deep learning, these systems can efficiently handle large training datasets and incorporate diverse supplementary data, thereby alleviating data sparsity and the cold-start problem. However, training deep-learning-based recommender systems requires a substantial amount of user data, which raises concerns regarding data security and user privacy. Federated learning (FL) has emerged as an innovative response to this challenge, particularly in the context of the Internet of Things (IoT). Nevertheless, existing FL recommender systems have been shown to be vulnerable to attacks during both the model training and inference phases, compromising data privacy and system robustness. This article introduces the asynchronous FL extreme deep factorization machine (AFedDFM), a deep learning recommendation model that applies FL principles to prioritize user privacy. To enhance privacy protection and mitigate the risk of inference attacks, the model incorporates pseudo-interaction padding to obscure real user interactions and implements local differential privacy by adding controlled noise to the shared model parameters, thereby strengthening its defenses against potential privacy breaches. In addition, AFedDFM employs anti-poisoning algorithms to reduce the impact of malicious clients. Furthermore, AFedDFM effectively learns both explicit and implicit feature interactions, improves recommendation accuracy, and addresses the cold-start problem. Extensive experiments conducted on two benchmark datasets demonstrate the superiority of our approach in terms of recommendation quality and privacy protection.
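The abstract does not detail the exact privacy mechanisms, but the two client-side steps it names, pseudo-interaction padding and local differential privacy on shared parameters, follow well-known patterns. The sketch below is a minimal illustration under standard assumptions (a Laplace mechanism with norm clipping for LDP, uniform sampling of unrated items for padding); all function names and parameters such as `epsilon`, `clip_norm`, and `pad_ratio` are chosen here for illustration and are not taken from the paper.

```python
import numpy as np

def pad_pseudo_interactions(rated_items, num_items, pad_ratio=1.0, rng=None):
    """Illustrative pseudo-interaction padding: mix sampled unrated items
    into the set of items a client reports updates for, so the server
    cannot tell which items the user actually interacted with."""
    rng = rng or np.random.default_rng()
    candidates = np.setdiff1d(np.arange(num_items), rated_items)
    num_pad = min(int(len(rated_items) * pad_ratio), len(candidates))
    padded = rng.choice(candidates, size=num_pad, replace=False)
    return np.concatenate([rated_items, padded])

def ldp_perturb(update, clip_norm=1.0, epsilon=1.0, rng=None):
    """Illustrative local differential privacy: clip the local parameter
    update to bound its sensitivity, then add Laplace noise calibrated to
    clip_norm / epsilon before the update leaves the client."""
    rng = rng or np.random.default_rng()
    norm = np.linalg.norm(update)
    clipped = update * min(1.0, clip_norm / (norm + 1e-12))
    noise = rng.laplace(loc=0.0, scale=clip_norm / epsilon, size=update.shape)
    return clipped + noise

# Example: a client perturbs its local update before asynchronously
# uploading it to the server (values here are stand-ins, not real data).
rng = np.random.default_rng(0)
local_update = rng.normal(size=16)  # stand-in for a model-parameter delta
shared_update = ldp_perturb(local_update, clip_norm=1.0, epsilon=0.5, rng=rng)
items_to_report = pad_pseudo_interactions(np.array([3, 7, 42]), num_items=100, rng=rng)
```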
External IDs: dblp:journals/iotj/ZhaoBSY25