Publication
Characteristic Circuits
Zhongjie Yu; Martin Trapp; Kristian Kersting
In: Alice Oh; Tristan Naumann; Amir Globerson; Kate Saenko; Moritz Hardt; Sergey Levine (Eds.). Advances in Neural Information Processing Systems 36: Annual Conference on Neural Information Processing Systems 2023, NeurIPS 2023, New Orleans, LA, USA, December 10-16, 2023. Neural Information Processing Systems (NeurIPS), Pages 1-19, arXiv, 2023.
Abstract
In many real-world scenarios, it is crucial to be able to reliably and efficiently reason under uncertainty while capturing complex relationships in data. Probabilistic circuits (PCs), a prominent family of tractable probabilistic models, offer a remedy to this challenge by composing simple, tractable distributions into a high-dimensional probability distribution. However, learning PCs on heterogeneous data is challenging and densities of some parametric distributions are not available in closed form, limiting their potential use. We introduce characteristic circuits (CCs), a family of tractable probabilistic models providing a unified formalization of distributions over heterogeneous data in the spectral domain. The one-to-one relationship between characteristic functions and probability measures enables us to learn high-dimensional distributions on heterogeneous data domains and facilitates efficient probabilistic inference even when no closed-form density function is available. We show that the structure and parameters of CCs can be learned efficiently from the data and find that CCs outperform state-of-the-art density estimators for heterogeneous data domains on common benchmark data sets.
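For context, the one-to-one relationship the abstract appeals to is the classical correspondence between a probability measure and its characteristic function; the definitions below are standard textbook material rather than notation taken from the paper. The characteristic function of a d-dimensional random vector X is

\[
\varphi_X(t) \;=\; \mathbb{E}\!\left[ e^{\, i \langle t, X \rangle} \right]
\;=\; \int_{\mathbb{R}^d} e^{\, i \langle t, x \rangle} \, \mathrm{d}\mu_X(x),
\qquad t \in \mathbb{R}^d,
\]

which exists for every probability measure \(\mu_X\), even when \(\mu_X\) admits no closed-form density; by Lévy's uniqueness theorem, \(\varphi_X\) determines \(\mu_X\) completely. The circuit composition then mirrors the sum-product structure of PCs: mixtures correspond to convex combinations of characteristic functions, \(\varphi = \sum_k w_k \varphi_k\) with \(w_k \ge 0\) and \(\sum_k w_k = 1\), while factorizations over independent, disjoint scopes correspond to products, \(\varphi_{(X,Y)}(s,t) = \varphi_X(s)\,\varphi_Y(t)\). This sketch states only the standard composition rules; the paper's exact construction and learning procedure are given in the full text.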
