Private Multi-Task Learning: Formulation and Applications to Federated Learning

Published: 12 Apr 2023, Last Modified: 12 Apr 2023. Accepted by TMLR.
Abstract: Many problems in machine learning rely on multi-task learning (MTL), in which the goal is to solve multiple related machine learning tasks simultaneously. MTL is particularly relevant for privacy-sensitive applications in areas such as healthcare, finance, and IoT computing, where sensitive data from multiple, varied sources are shared for the purpose of learning. In this work, we formalize notions of client-level privacy for MTL via billboard privacy (BP), a relaxation of differential privacy for mechanism design and distributed optimization. We then propose an algorithm for mean-regularized MTL, an objective commonly used for applications in personalized federated learning, subject to BP. We analyze our objective and solver, providing certifiable guarantees on both privacy and utility. Empirically, we find that our method provides improved privacy/utility trade-offs relative to global baselines across common federated learning benchmarks.
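The abstract describes mean-regularized MTL under billboard privacy: each client optimizes its own model plus a penalty pulling it toward a privatized global mean that the server publishes on a "billboard". Below is a minimal toy sketch of that idea, not the paper's actual algorithm: the client losses, the quadratic form `F_k`, the noise scale `sigma`, and all hyperparameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup: K clients, each with a local quadratic loss
# F_k(w) = 0.5 * ||w - c_k||^2 (the centers c_k stand in for private data).
K, d = 10, 5
centers = rng.normal(size=(K, d))
weights = np.zeros((K, d))

lam = 0.5        # mean-regularization strength (assumed value)
sigma = 0.1      # std of Gaussian noise on the broadcast mean (assumed)
steps, lr = 200, 0.1

for _ in range(steps):
    # "Billboard": the server publishes a noisy mean of client models;
    # clients only ever see this privatized aggregate.
    noisy_mean = weights.mean(axis=0) + rng.normal(scale=sigma, size=d)
    # Each client takes a gradient step on its regularized local objective
    # F_k(w_k) + (lam / 2) * ||w_k - noisy_mean||^2:
    grads = (weights - centers) + lam * (weights - noisy_mean)
    weights -= lr * grads

# Each client's model ends up near a blend of its local optimum and the
# global mean, with lam controlling the degree of personalization.
```

With this quadratic loss, each client converges (up to noise) to `(c_k + lam * mean(c)) / (1 + lam)`, which makes the personalization/regularization trade-off easy to see; a calibrated noise scale and privacy accounting would be required for actual billboard-privacy guarantees.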
Submission Length: Regular submission (no more than 12 pages of main content)
Video: https://drive.google.com/file/d/1J35lwtPfLtEF3HpomNKYc3CzEW6nhgu6/view?usp=share_link
Code: https://github.com/s-huu/PMTL
Supplementary Material: zip
Assigned Action Editor: ~Antti_Honkela1
License: Creative Commons Attribution 4.0 International (CC BY 4.0)
Submission Number: 660