OPA: One-Shot Private Aggregation with Single Client Interaction and Its Applications to Federated Learning
Abstract: Our work minimizes interaction in secure computation, addressing the high cost of communication rounds, especially with many clients. We introduce One-shot Private Aggregation (\(\textsf{OPA}\)), which enables each client to communicate only once per aggregation evaluation in a single-server setting. This simplifies dropout management and dynamic participation, contrasting with multi-round protocols such as Bonawitz et al. (CCS'17) and subsequent works, and avoids complex committee selection akin to YOSO. \(\textsf{OPA}\)'s communication pattern closely mimics learning-in-the-clear, where each client speaks only once.
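To make the "each client speaks only once" pattern concrete, here is a minimal toy sketch of single-message masked aggregation. This is NOT the paper's \(\textsf{OPA}\) construction (which targets a single server with dropout tolerance); it is the classic pairwise-masking idea, shown only to illustrate how masks that cancel in the sum let a server learn the aggregate without seeing individual inputs. The setup, modulus, and function names are all illustrative assumptions, and the sketch assumes a fixed set of clients with no dropouts.

```python
import secrets

# Toy single-message masked aggregation (illustration only; not the
# OPA protocol). A one-time setup gives each client pair (i, j) a
# shared random mask; client i adds it and client j subtracts it, so
# all masks cancel when the server sums the messages.

MOD = 2**61 - 1  # public prime modulus for the aggregation arithmetic


def setup_pairwise_masks(n):
    """One-time setup: a random mask for every unordered client pair."""
    return {
        (i, j): secrets.randbelow(MOD)
        for i in range(n)
        for j in range(i + 1, n)
    }


def mask_input(i, x, n, masks):
    """Client i's single message: its input plus cancelling masks."""
    y = x % MOD
    for j in range(n):
        if j == i:
            continue
        m = masks[(min(i, j), max(i, j))]
        # Lower-indexed client adds the pair mask, higher one subtracts.
        y = (y + m) % MOD if i < j else (y - m) % MOD
    return y


def aggregate(messages):
    """Server sums the masked messages; the pairwise masks cancel."""
    return sum(messages) % MOD
```

In this toy version each mask is added exactly once and subtracted exactly once, so the server recovers the exact sum while each individual message is uniformly random. Handling client dropouts and avoiding the pairwise setup entirely is precisely where the paper's one-shot construction departs from this sketch.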
External IDs: dblp:conf/crypto/KarthikeyanP25