Keywords: Federated Learning, Asynchronous Machine Learning, Low-Rank Adaptation, Continual Learning
TL;DR: In Asynchronous Federated Learning, using a library of adapter bases for updates outperforms averaging in settings of data heterogeneity and site asynchronicity.
Abstract: We consider the problem of learning to adapt a foundation model in a federated setting, particularly the most realistic and general setting: (1) the local datasets are sampled from different distributions, yet we want to learn a globally adapted model; (2) local agents enter and leave the federation asynchronously at each time tick, beyond the control of the learning algorithm; and (3) the goal is continuous adaptation, so that after each time tick, the learned adapter generalizes accurately for all participants seen during training. We propose a simple idea called federated library-based adaptation (LEAN) for exactly this setting. In library-based adaptation, the system maintains a pool, or "library," of so-called "basis pairs." Agents entering the federation check out basis pairs, update them, and check them in. Library-based adaptation is designed to avoid problems with more conventional methods, such as those based on averaging. In particular, we demonstrate that LEAN outperforms traditional averaging baselines in both communication and computation efficiency across a broad range of important settings, including heavy data skewness and high asynchronicity.
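The check-out/update/check-in mechanism described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the class and function names (`AdapterLibrary`, `local_update`), the shapes of the basis pairs, and the gradient-step update rule are all assumptions made for illustration; the paper's actual basis-pair parameterization and local update may differ.

```python
import numpy as np

class AdapterLibrary:
    """Hypothetical sketch of a 'library' of low-rank basis pairs (A, B).

    Each pair (A, B) parameterizes a low-rank weight update W ~= B @ A,
    in the spirit of LoRA-style adapters. Agents check a pair out,
    update it locally, and check it back in.
    """

    def __init__(self, num_pairs, d, r, seed=0):
        rng = np.random.default_rng(seed)
        # A is initialized with small random values, B with zeros,
        # so each pair starts as a (near) zero update.
        self.pairs = [
            (rng.standard_normal((r, d)) * 0.01, np.zeros((d, r)))
            for _ in range(num_pairs)
        ]
        self.available = set(range(num_pairs))

    def check_out(self):
        # Hand an unused basis pair to an entering agent.
        idx = self.available.pop()  # KeyError if every pair is checked out
        return idx, self.pairs[idx]

    def check_in(self, idx, pair):
        # The agent returns its updated pair to the library.
        self.pairs[idx] = pair
        self.available.add(idx)


def local_update(pair, grads, lr=0.1):
    # Placeholder local step: one gradient-descent update on the factors.
    A, B = pair
    gA, gB = grads
    return A - lr * gA, B - lr * gB
```

A usage round under this sketch: an agent joining the federation calls `check_out()`, runs `local_update` on its private data, then calls `check_in()`; no averaging across agents is required at check-in time.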
Primary Area: transfer learning, meta learning, and lifelong learning
Submission Number: 12635