ZOOPFL: Exploring Black-box Foundation Models for Personalized Federated Learning

Published: 01 Oct 2024, Last Modified: 05 Nov 2024. Venue: FL@FM-NeurIPS'24 (Oral). License: CC0 1.0
Keywords: Federated Learning, Personalization, Zero-order Optimization, Black-Box
Abstract: When personalized federated learning (FL) meets large foundation models, new challenges arise from various limitations in resources. In addition to typical constraints on data, computation, and communication costs, access to the models themselves is also often limited. This paper endeavors to solve both the challenge of limited resources and that of personalization, i.e., distribution shifts between clients. To do so, we propose a method named ZOOPFL that uses Zeroth-Order Optimization for Personalized Federated Learning. ZOOPFL avoids directly modifying the foundation model and instead learns to adapt its inputs through zeroth-order optimization. In addition, we employ simple yet effective linear projections to remap its predictions for personalization. To reduce computation costs and enhance personalization, we propose input surgery, which incorporates an auto-encoder with low-dimensional, client-specific embeddings. We provide theoretical analysis of ZOOPFL's convergence. Extensive empirical experiments on computer vision and natural language processing tasks using popular foundation models demonstrate its effectiveness for FL on black-box foundation models.
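The abstract's core idea, adapting the inputs of a model whose internals cannot be touched, rests on zeroth-order gradient estimation: the loss is queried at randomly perturbed copies of the input-side parameters, and the finite differences yield a gradient estimate. The following is a minimal sketch of that idea under stated assumptions; `black_box_model`, `zo_gradient`, and the additive-perturbation parameterization are illustrative placeholders, not the authors' implementation (which additionally uses an auto-encoder for input surgery and linear output projections).

```python
import numpy as np

def black_box_model(x):
    """Stand-in for an inaccessible foundation model: queryable, not differentiable.
    Returns 10 class logits deterministically derived from the input (toy behavior)."""
    rng = np.random.default_rng(abs(hash(x.tobytes())) % (2**32))
    return rng.standard_normal(10)

def loss(logits, label):
    """Cross-entropy of softmax probabilities against an integer label."""
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    return -np.log(probs[label] + 1e-12)

def zo_gradient(theta, x, label, mu=1e-3, num_dirs=8):
    """Two-point zeroth-order estimate of the loss gradient w.r.t. the
    input-adaptation parameters `theta` (here: an additive perturbation on x)."""
    grad = np.zeros_like(theta)
    for _ in range(num_dirs):
        u = np.random.standard_normal(theta.shape)        # random direction
        f_plus = loss(black_box_model(x + theta + mu * u), label)
        f_minus = loss(black_box_model(x + theta - mu * u), label)
        grad += (f_plus - f_minus) / (2 * mu) * u          # directional finite difference
    return grad / num_dirs

# Toy client-side update: only the input-side parameters change,
# the black-box model is never opened or modified.
x, label = np.random.standard_normal(64), 3
theta = np.zeros_like(x)
for step in range(20):
    theta -= 0.01 * zo_gradient(theta, x, label)
```

In a federated setting, each client would run such queries locally and share only the learnable input-side parameters, keeping both the raw data and the foundation model untouched.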
Submission Number: 18