Primary Area: generative models
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: GFlowNet, Federated Learning
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Abstract: Generative flow networks (GFlowNets) are powerful samplers for distributions supported on spaces of compositional objects (e.g., sequences and graphs), with applications ranging from the design of biological sequences to causal discovery. However, there are no principled approaches for training GFlowNets in federated settings, where the target distribution results from a combination of (possibly sensitive) rewards from different parties. To fill this gap, we propose *federated contrastive GFlowNet* (FC-GFlowNet), a divide-and-conquer framework for federated learning of GFlowNets that requires a single communication step. First, each client learns a GFlowNet locally to sample proportionally to its reward. Then, the server gathers the local policy networks and aggregates them to enforce *federated balance* (FB), which provably ensures the correctness of FC-GFlowNet. Additionally, our theoretical analysis builds on a novel concept, which we coin *contrastive balance*, that imposes necessary and sufficient conditions for the correctness of general (non-federated) GFlowNets. We empirically attest to the performance of FC-GFlowNets in four controlled settings, including grid-world, sequence, and multiset generation, and Bayesian phylogenetic inference. Our experiments also suggest that, in some cases, the contrastive balance objective can accelerate the training of conventional GFlowNets.
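As a rough illustration of the contrastive balance idea mentioned in the abstract, the sketch below compares the balance residuals of two sampled trajectories; because the (log) partition function would appear identically in both residuals, it cancels in the difference. This is a hypothetical minimal sketch, not the paper's implementation: the function name and the assumption that each input is a per-trajectory scalar (summed log forward-policy probabilities, summed log backward-policy probabilities, and log reward) are my own.

```python
def contrastive_balance_loss(log_pf_a, log_pb_a, log_r_a,
                             log_pf_b, log_pb_b, log_r_b):
    """Squared difference between the balance residuals of two trajectories.

    Each residual, log PF(tau) - log PB(tau) - log R(x), would equal
    -log Z for a perfectly trained GFlowNet; contrasting two trajectories
    cancels log Z, so no explicit partition-function estimate is needed.
    (Hypothetical sketch; argument names and scalar inputs are assumptions.)
    """
    residual_a = log_pf_a - log_pb_a - log_r_a
    residual_b = log_pf_b - log_pb_b - log_r_b
    return (residual_a - residual_b) ** 2
```

In practice the residuals would be computed from a parametric policy and the squared difference minimized over pairs of sampled trajectories; the loss is zero exactly when both trajectories satisfy the same balance constraint.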
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
Supplementary Material: zip
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 6008