Abstract: Decentralized machine learning provides a unique opportunity to create data-driven applications without large investments in centralized infrastructure. In previous work, we introduced gossip learning for this purpose: models perform random walks in the network, and nodes train the received models on their locally available data. We also proposed various improvements, such as model subsampling, merging, and token-based flow control. Gossip learning is robust to failures and does not require synchronization. Efficiency in terms of network bandwidth is also a major concern for decentralized learning algorithms, especially when they are deployed over networks of IoT devices or smartphones. Here, we improve the model merging method so that gossip learning benefits more from token-based flow control. We experimentally evaluate our solution on several classification problems in simulations using an availability trace based on real-world smartphone measurements. Our results indicate that the improved variant significantly outperforms previously proposed solutions.
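The protocol summarized above can be illustrated with a toy sketch. All names, the single-weight linear model, and the parameter-averaging merge rule below are illustrative assumptions, not the authors' exact method: a model hops between nodes in a random walk, is trained on each visited node's local data, and is occasionally merged with another walking model.

```python
import random

random.seed(0)

# Hypothetical setup: 5 nodes, each holding local (x, y) samples from y = 2x.
nodes = [[(x, 2.0 * x) for x in (1.0, 2.0)] for _ in range(5)]

def local_update(w, data, lr=0.1):
    """One SGD pass over a node's local data for the model y_hat = w * x."""
    for x, y in data:
        grad = 2.0 * (w * x - y) * x  # gradient of squared error
        w -= lr * grad
    return w

def merge(w_a, w_b):
    """Illustrative merge step: average the parameters of two models."""
    return (w_a + w_b) / 2.0

# Two models perform independent random walks; every 10 steps they
# "meet" and are merged (a stand-in for the gossip merging mechanism).
w, w_other = 0.0, 1.0
for step in range(50):
    w = local_update(w, random.choice(nodes))
    w_other = local_update(w_other, random.choice(nodes))
    if step % 10 == 9:
        w = w_other = merge(w, w_other)

print(round(w, 3))  # both walks converge toward the true weight 2.0
```

The sketch omits everything the paper is actually about, e.g. churn, bandwidth limits, and token-based flow control, but shows the core loop: random walk, local training, merge.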