Keywords: Federated Learning, Black-box Prompt Learning, Large Language Model
TL;DR: Federated black-box prompt learning poses a previously unaddressed query-efficiency challenge for cloud-based LLM APIs; we analyze it theoretically and propose FedOne.
Abstract: Black-Box Discrete Prompt Learning (BDPL) is a prompt-tuning method that optimizes discrete prompts without accessing model parameters or gradients, making prompt tuning on cloud-based Large Language Models (LLMs) feasible.
Adapting Federated Learning (FL) to BDPL could further enhance prompt tuning performance by leveraging data from diverse sources.
However, previous research on federated black-box prompt tuning has neglected the substantial query cost associated with cloud-based LLM services.
To address this gap, we conduct a theoretical analysis of query efficiency in federated black-box prompt tuning. Our findings reveal that restricting FedAvg to activate only one client per round, a strategy we call *FedOne*, achieves optimal query efficiency in federated black-box prompt learning.
Building on this insight, we propose the FedOne framework, a federated black-box discrete prompt learning method designed to maximize query efficiency when interacting with cloud-based LLMs.
We conduct numerical experiments on various aspects of our framework, which demonstrate a significant improvement in query efficiency, consistent with our theoretical results.
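The one-client-per-round scheduling at the core of FedOne can be illustrated with a minimal, self-contained sketch. Everything here is an illustrative assumption rather than the authors' implementation: the toy token vocabulary, the mock `query_llm_api` reward function, and the simple running-average score update stand in for a real black-box LLM API and the actual discrete prompt optimizer. The point is only the scheduling: each round the server activates a single client, so total API queries scale with the number of rounds rather than with the number of clients.

```python
# Hypothetical sketch of FedOne-style scheduling: one client per round.
# Vocabulary, rewards, and the update rule are toy assumptions.
import random

VOCAB = ["good", "great", "bad", "awful"]      # candidate prompt tokens
TARGET = {"great": 1.0, "good": 0.5}           # toy per-token reward

def query_llm_api(token, client_id):
    """Stand-in for a black-box LLM API call: a noisy, client-dependent reward."""
    rng = random.Random(client_id * 1000 + sum(map(ord, token)))
    return TARGET.get(token, 0.0) + rng.uniform(-0.1, 0.1)

def fedone(num_clients=5, rounds=40, lr=0.5, seed=0):
    rng = random.Random(seed)
    scores = {t: 0.0 for t in VOCAB}           # server-side shared prompt scores
    total_queries = 0
    for _ in range(rounds):
        client = rng.randrange(num_clients)    # activate ONE client this round
        for t in VOCAB:                        # client's local black-box queries
            reward = query_llm_api(t, client)
            scores[t] += lr * (reward - scores[t])   # local update sent to server
            total_queries += 1
    best = max(scores, key=scores.get)
    return best, total_queries

best, queries = fedone()
print(best, queries)   # query count grows with rounds, not with num_clients
```

Under a full-participation FedAvg schedule, every client would query the API each round, multiplying the query cost by `num_clients`; the sketch above keeps per-round cost constant regardless of how many clients are registered.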
Primary Area: foundation or frontier models, including LLMs
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 9093