Distributed Personalized Empirical Risk Minimization

Published: 25 Jun 2023, Last Modified: 25 Jun 2023. Venue: FL4Data-Mining (Oral)
Keywords: federated learning
Abstract: This paper introduces a new \textit{data\&system-aware} paradigm for learning from multiple heterogeneous data sources to achieve optimal statistical accuracy across all data distributions without imposing stringent constraints on the computational resources shared by participating devices. The proposed Personalized Empirical Risk Minimization (PERM) schema, though simple, provides an efficient solution that enables each client to learn a personalized model by \textit{learning who to learn with}, i.e., by personalizing the aggregation of data sources through an efficient empirical statistical discrepancy estimation module. PERM can also be employed in other learning settings with multiple sources of data, such as domain adaptation and multi-task learning, to attain optimal statistical accuracy. To efficiently solve all aggregated personalized losses, we propose a model shuffling idea that optimizes all losses in parallel. This also enables us to learn models of varying complexity for different devices to match their available resources.
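To make the two ingredients of the abstract concrete, the following is a minimal, purely illustrative Python sketch (not the authors' actual algorithm): each client estimates per-source mixing weights from a simple transfer-loss proxy for statistical discrepancy, and the personalized objectives are then optimized by cycling ("shuffling") the personalized models across clients so that all aggregated losses are updated in parallel over one pass. All names, the toy data, and the discrepancy proxy are assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup: K clients, each with a local linear-regression dataset.
# Clients 0-1 and 2-3 share underlying models, so discrepancy-aware weights
# should pair similar clients together.
K, n, d = 4, 50, 5
X = [rng.normal(size=(n, d)) for _ in range(K)]
true_w = [rng.normal(size=d) for _ in range(2)]
y = [X[k] @ true_w[k // 2] + 0.1 * rng.normal(size=n) for k in range(K)]

def local_loss(model, k):
    """Empirical risk of `model` on client k's data (mean squared error)."""
    r = X[k] @ model - y[k]
    return 0.5 * np.mean(r ** 2)

def mixing_weights():
    """Illustrative discrepancy estimation (an assumption, not the paper's module):
    weight source j for client i by how well client i's locally fitted model
    transfers to j's data, then normalize per client."""
    fits = [np.linalg.lstsq(X[k], y[k], rcond=None)[0] for k in range(K)]
    W = np.zeros((K, K))
    for i in range(K):
        disc = np.array([local_loss(fits[i], j) for j in range(K)])
        W[i] = np.exp(-disc / disc.mean())   # smaller discrepancy -> larger weight
        W[i] /= W[i].sum()
    return W

W = mixing_weights()

# Model shuffling, sketched sequentially: in each round, personalized model i
# sits on client (i + shift) % K and takes a gradient step on that client's
# loss scaled by its mixing weight, so all K aggregated personalized
# objectives are optimized in parallel over one pass of the devices.
models = [np.zeros(d) for _ in range(K)]
lr = 0.1
for epoch in range(200):
    for shift in range(K):
        for i in range(K):
            j = (i + shift) % K
            grad = X[j].T @ (X[j] @ models[i] - y[j]) / n
            models[i] -= lr * W[i, j] * grad

print("personalized losses:", [round(local_loss(models[k], k), 4) for k in range(K)])
```

In this sketch, the mixing-weight step stands in for the empirical statistical discrepancy estimation module, and the shuffled update loop stands in for the parallel optimization of the aggregated personalized losses; in the actual distributed setting each model would reside on a different device per round rather than being updated in a single process.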