Keywords: federated learning, probabilistic circuits
Abstract: Probabilistic circuits (PCs) enable us to represent joint distributions over a set of random variables and can be seen as hierarchical mixture models. This representation allows various probabilistic queries to be answered in tractable time. However, the properties of PCs have so far been explored only in the realm of tractable probabilistic modeling. In this work, we unveil a deep connection between PCs and federated learning (FL), leading to federated circuits (FCs)---a novel, flexible, modular, and communication-efficient FL framework that, for the first time, unifies horizontal, vertical, and hybrid FL by re-framing FL as a density estimation problem over distributed datasets. Moreover, FCs allow us to scale \textit{tractable} probabilistic models (PCs) to large-scale datasets by recursively partitioning both the data and the model itself across a distributed learning environment. We empirically demonstrate FCs' versatility in handling horizontal, vertical, and hybrid FL within a unified framework on multiple classification tasks, and we further demonstrate FCs' capability to scale PCs to large-scale data on various real-world image datasets.
Submission Number: 18