Model Aggregation via Good-Enough Model Spaces

Anonymous

16 May 2019 (modified: 05 May 2023) · Submitted to AMTL 2019
Keywords: distributed learning, version spaces, privacy, model aggregation
TL;DR: We present Good-Enough Model Spaces (GEMS), a framework for learning an aggregate model over distributed nodes within a small number of communication rounds.
Abstract: In many applications, the training data for a machine learning task is partitioned across multiple nodes, and aggregating this data may be infeasible due to storage, communication, or privacy constraints. In this work, we present Good-Enough Model Spaces (GEMS), a novel framework for learning a global satisficing (i.e., "good-enough") model within a few communication rounds by carefully combining the spaces of the local nodes' satisficing models. In experiments on benchmark and medical datasets, our approach outperforms baseline aggregation techniques such as ensembling and model averaging, and performs comparably to the ideal non-distributed models.
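
The abstract describes the core recipe: each node characterizes a set of locally satisficing models, and the server combines these sets to find one model that is good enough everywhere. Below is a minimal, hypothetical sketch of that idea, not the authors' implementation: it assumes each node's satisficing set can be approximated by a ball around its local optimum (a center plus an empirically probed radius), and that the server recovers a point in the intersection of the balls by cyclic projection. The least-squares loss, the slack threshold, and all function names are our illustrative choices.

```python
# Illustrative sketch of the GEMS idea (not the paper's code): approximate
# each node's "good-enough" set by a ball in parameter space, then find a
# model in the intersection of all nodes' balls.
import numpy as np

def local_good_enough_ball(X, y, slack=1.1, n_probe=64, seed=0):
    """Fit a least-squares model locally, then binary-search for the largest
    radius r such that every probed model within r of the optimum still has
    loss <= slack * optimal loss. Returns (center, radius)."""
    rng = np.random.default_rng(seed)
    w_star, *_ = np.linalg.lstsq(X, y, rcond=None)
    loss = lambda w: np.mean((X @ w - y) ** 2)
    budget = slack * loss(w_star)
    lo, hi = 0.0, 10.0
    for _ in range(30):
        r = 0.5 * (lo + hi)
        dirs = rng.normal(size=(n_probe, X.shape[1]))
        dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
        worst = max(loss(w_star + r * d) for d in dirs)
        lo, hi = (r, hi) if worst <= budget else (lo, r)
    return w_star, lo

def intersect_balls(centers, radii, n_iter=200):
    """Find a point in the intersection of balls via cyclic projection
    (POCS); converges to a feasible point when the intersection is nonempty."""
    w = np.mean(centers, axis=0)
    for _ in range(n_iter):
        for c, r in zip(centers, radii):
            gap = np.linalg.norm(w - c)
            if gap > r:  # outside this node's ball: project onto it
                w = c + (w - c) * (r / gap)
    return w

# Toy usage: three nodes with private shards of the same regression task.
rng = np.random.default_rng(1)
w_true = np.array([1.0, -2.0, 0.5])
balls = []
for node in range(3):
    X = rng.normal(size=(200, 3))
    y = X @ w_true + 0.1 * rng.normal(size=200)
    balls.append(local_good_enough_ball(X, y, seed=node))  # one round of comm.
centers, radii = zip(*balls)
w_global = intersect_balls(np.array(centers), list(radii))
```

Note that only each node's (center, radius) summary crosses the network, which is what keeps the communication to a handful of rounds; whether the ball approximation is tight depends on the loss geometry, and the paper's actual construction of the satisficing spaces may differ.
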