GREC: Doubly Efficient Privacy-preserving Recommender Systems for Resource-Constrained Devices

27 Sept 2024 (modified: 05 Feb 2025) · Submitted to ICLR 2025 · CC BY 4.0
Keywords: Federated learning, recommender system, secure aggregation
Abstract: Federated recommender systems (FedRec) have emerged as a solution for protecting user data through collaborative training. However, real-world deployment of FedRec is hindered by two critical resource constraints on edge devices: (a) limited upload bandwidth and (b) limited user-side computational power and storage. Existing methods for the first issue, such as message compression, often degrade accuracy or risk privacy leakage. For the second issue, most federated learning (FL) protocols assume that users store and maintain the entire model locally for private inference, which is resource intensive. To address these challenges, we propose GREC, a doubly efficient privacy-preserving recommender system covering both the training and inference phases. To reduce communication costs during training, we design a lossless secure aggregation (SecAgg) protocol based on functional secret sharing that leverages the sparsity of the update matrix. During inference, we apply a user-side post-processing local differential privacy (LDP) algorithm that preserves privacy while shifting the bulk of computation to the cloud. Our framework reduces uplink communication costs by up to 90x compared to existing SecAgg protocols and cuts user-side computation time during inference by an average of 11x compared to full-model inference. This makes GREC a practical and scalable solution for deploying federated recommender systems on resource-constrained devices.
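To make the secure-aggregation idea in the abstract concrete, below is a minimal illustrative sketch of additive secret sharing, the building block underlying SecAgg-style protocols. This is not GREC's actual functional-secret-sharing construction; the ring size, party count, and the idea of sharing only nonzero coordinates of a sparse update are illustrative assumptions.

```python
import random

MOD = 2**32  # arithmetic over a finite ring; the modulus is an illustrative choice

def share(value, n_parties):
    """Split an integer into n additive shares that sum to value mod MOD.
    No strict subset of shares reveals anything about the value."""
    shares = [random.randrange(MOD) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % MOD)
    return shares

def aggregate(all_user_shares):
    """Each party locally sums the shares it received from all users;
    combining the parties' totals reconstructs only the aggregate."""
    n_parties = len(all_user_shares[0])
    party_totals = [sum(u[p] for u in all_user_shares) % MOD
                    for p in range(n_parties)]
    return sum(party_totals) % MOD

# Two users each hold one nonzero coordinate of a sparse update;
# only nonzero entries need to be shared, which is how sparsity cuts uplink cost.
updates = [5, 7]
shares = [share(u, 3) for u in updates]
print(aggregate(shares))  # → 12
```

The servers learn only the sum of the updates, never any individual user's contribution, which is the privacy guarantee SecAgg protocols provide during federated training.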
Primary Area: alignment, fairness, safety, privacy, and societal considerations
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 8695