Internet Learning: Preliminary Steps Towards Highly Fault-Tolerant Learning on Device Networks

Published: 16 Jun 2023, Last Modified: 17 Jul 2023, ICML LLW 2023
Keywords: Distributed Learning, Federated Learning, Decentralized Machine Learning, Fault Tolerance, Feature Parallelism, Local Learning
TL;DR: We present Internet Learning, a paradigm for fault-tolerant distributed machine learning inspired by the internet.
Abstract: Distributed machine learning has grown in popularity due to data privacy, edge computing, and large model training. A subset of this class, Vertical Federated Learning (VFL), aims to provide privacy guarantees in the scenario where each party shares the same sample space but holds only a subset of the features. While VFL tackles key privacy challenges, it often assumes perfect hardware or communication (and may perform poorly under other conditions). This assumption hinders the broad deployment of VFL, particularly on edge devices, which may need to conserve power and may connect or disconnect at any time. To address this gap, we define the paradigm of *Internet Learning* (IL), which describes a context that subsumes VFL and makes good performance under extremely dynamic conditions of the data entities the primary goal. As IL represents a fundamentally different paradigm, it will likely require novel learning algorithms beyond end-to-end backpropagation, which requires careful synchronization across devices. In light of this, we provide some potential approaches for the IL context and present preliminary analysis and experimental results on a toy problem.
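To make the feature-partitioned setting concrete, here is a minimal sketch (not the paper's algorithm) of a vertical split in which each "device" trains only on its own feature slice with a purely local objective, with no end-to-end backpropagation across devices, and inference averages the predictions of whichever devices happen to be online. All names (e.g. `num_devices`, `train_local_model`) are illustrative assumptions, not from the submission.

```python
# Hypothetical sketch of a feature-partitioned (vertical) split with local-only
# training and dropout-tolerant inference; not the authors' method.
import numpy as np

rng = np.random.default_rng(0)

# Toy binary-classification data: 200 samples, 8 features.
n_samples, n_features = 200, 8
X = rng.normal(size=(n_samples, n_features))
true_w = rng.normal(size=n_features)
y = (X @ true_w + 0.1 * rng.normal(size=n_samples) > 0).astype(float)

# Vertical (feature-wise) partition: each device holds a disjoint feature slice.
num_devices = 4
feature_slices = np.array_split(np.arange(n_features), num_devices)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_local_model(X_local, y, lr=0.1, steps=500):
    """Logistic regression trained with only local information
    (no cross-device gradient exchange, i.e. no end-to-end backprop)."""
    w = np.zeros(X_local.shape[1])
    b = 0.0
    for _ in range(steps):
        p = sigmoid(X_local @ w + b)
        w -= lr * (X_local.T @ (p - y) / len(y))
        b -= lr * np.mean(p - y)
    return w, b

# Each device fits its own model on its own feature slice.
local_models = [train_local_model(X[:, idx], y) for idx in feature_slices]

def predict(X, online_mask):
    """Average predictions of the devices that are currently online,
    so inference degrades gracefully when devices drop out."""
    preds = [sigmoid(X[:, idx] @ w + b)
             for online, idx, (w, b) in zip(online_mask, feature_slices, local_models)
             if online]
    return np.mean(preds, axis=0)

# Simulate dynamic connectivity: all devices online vs. half offline.
for mask in ([True] * num_devices, [True, False, True, False]):
    acc = np.mean((predict(X, mask) > 0.5) == y)
    print(f"online={mask}  train accuracy={acc:.2f}")
```

The point of the sketch is only that per-device local objectives avoid the tight synchronization that end-to-end backpropagation requires, so a device disconnecting degrades the prediction rather than breaking training.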
Submission Number: 21