
Publication

Scaling Probabilistic Circuits via Data Partitioning

Jonas Seng; Florian Peter Busch; Pooja Prasad; Devendra Singh Dhami; Martin Mundt; Kristian Kersting
In: Silvia Chiappa; Sara Magliacane (Eds.). Conference on Uncertainty in Artificial Intelligence, Rio Othon Palace, Rio de Janeiro, Brazil, 21-25 July 2025. International Conference on Uncertainty in AI (UAI), Pages 3701-3717, Proceedings of Machine Learning Research, Vol. 286, PMLR, 2025.

Abstract

Probabilistic circuits (PCs) enable us to learn joint distributions over a set of random variables and to perform various probabilistic queries in a tractable fashion. Though the tractability property allows PCs to scale beyond non-tractable models such as Bayesian Networks, scaling training and inference of PCs to larger, real-world datasets remains challenging. To remedy the situation, we show how PCs can be learned across multiple machines by recursively partitioning a distributed dataset, thereby unveiling a deep connection between PCs and federated learning (FL). This leads to federated circuits (FCs), a novel and flexible FL framework that (1) allows one to scale PCs to distributed learning environments, (2) trains PCs faster, and (3) unifies, for the first time, horizontal, vertical, and hybrid FL in one framework by re-framing FL as a density estimation problem over distributed datasets. We demonstrate FC's capability to scale PCs on various large-scale datasets. We also show FC's versatility in handling horizontal, vertical, and hybrid FL within a unified framework on multiple classification tasks.
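One plausible reading of the connection the abstract describes: a horizontal partition (clients hold disjoint samples) corresponds to a sum node that mixes per-client densities, while a vertical partition (clients hold disjoint features) corresponds to a product node that factorizes the joint over feature blocks. The following is a minimal, hypothetical sketch of that correspondence with Gaussian leaves; it is an illustration under these assumptions, not the paper's implementation, and all class names are invented for this example.

```python
import numpy as np

class GaussianLeaf:
    """Univariate Gaussian leaf fit locally on one feature column."""
    def __init__(self, column):
        self.mu = column.mean()
        self.sigma = column.std() + 1e-6

    def logpdf(self, x):
        return (-0.5 * ((x - self.mu) / self.sigma) ** 2
                - np.log(self.sigma * np.sqrt(2.0 * np.pi)))

class ProductNode:
    """Vertical partition: disjoint feature blocks, so the joint density
    factorizes into a product of per-block densities."""
    def __init__(self, children, scopes):
        self.children, self.scopes = children, scopes

    def logpdf(self, x):
        return sum(child.logpdf(x[:, scope])
                   for child, scope in zip(self.children, self.scopes))

class SumNode:
    """Horizontal partition: disjoint samples, so the overall density is a
    mixture of per-client densities weighted by each client's data share."""
    def __init__(self, children, weights):
        self.children = children
        self.weights = np.asarray(weights, dtype=float)
        self.weights /= self.weights.sum()

    def logpdf(self, x):
        logs = np.stack([child.logpdf(x) for child in self.children])  # (clients, N)
        m = logs.max(axis=0)
        # log-sum-exp of the weighted mixture for numerical stability
        return m + np.log((np.exp(logs - m) * self.weights[:, None]).sum(axis=0))

# Two hypothetical "clients" each hold a horizontal shard of a 2-feature dataset;
# each fits a fully factorized local model, and a "server" mixes them by sample count.
rng = np.random.default_rng(0)
shards = [rng.normal(0.0, 1.0, size=(300, 2)), rng.normal(3.0, 1.0, size=(200, 2))]
local_models = [ProductNode([GaussianLeaf(s[:, 0]), GaussianLeaf(s[:, 1])], scopes=[0, 1])
                for s in shards]
global_model = SumNode(local_models, weights=[len(s) for s in shards])
print(global_model.logpdf(np.array([[0.5, -0.2], [3.1, 2.8]])))
```

In this toy setup only node parameters and log-densities cross machine boundaries, which is one way the abstract's framing of FL as distributed density estimation can be pictured; the actual FC training and communication scheme is detailed in the paper itself.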
