Generative modeling languages, i.e., probabilistic programming languages (PPLs), offer only limited support for continuous random variables; existing theorems on semantics and algorithmic correctness typically cover only discrete variables or, in some cases, bounded continuous variables.

We show natural examples that violate these restrictions and break standard algorithms. Using *probability kernels*, the measure-theoretic generalization of conditional distributions, we develop the notion of *measure-theoretic Bayesian networks (MTBNs)* and use it to provide more general semantics for PPLs with arbitrarily many random variables defined over arbitrary measure spaces. We also derive provably correct sampling algorithms for the generalized models and integrate them into the BLOG PPL.
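As a rough illustration (not the paper's construction or the BLOG implementation), a probability kernel can be thought of as a map from parent values to a distribution over the child's space; composing kernels along a topological order yields a forward sampler. The hypothetical Python sketch below does this for a toy model in which an unbounded discrete variable governs how many continuous variables exist — the kind of mixed, open-ended model the MTBN semantics is meant to cover. All model and function names here are invented for illustration.

```python
import math
import random

# Toy model (hypothetical, for illustration only):
#   N ~ Poisson(2)              an unbounded discrete variable
#   X_i | N ~ Normal(0, 1)      one continuous variable per i < N
#   S = sum_i X_i               a deterministic kernel of the X_i

def poisson_kernel(rate):
    # Kernel with no parents: returns a sampler for Poisson(rate),
    # using Knuth's multiplication method.
    def sample():
        limit, k, p = math.exp(-rate), 0, 1.0
        while True:
            p *= random.random()
            if p <= limit:
                return k
            k += 1
    return sample

def normal_kernel(n):
    # Kernel from the value of N to a distribution over R^n:
    # n i.i.d. standard normal draws.
    def sample():
        return [random.gauss(0.0, 1.0) for _ in range(n)]
    return sample

def forward_sample():
    # Compose the kernels in topological order: first N,
    # then X given N, then the deterministic sum S.
    n = poisson_kernel(2.0)()
    xs = normal_kernel(n)()
    return {"N": n, "X": xs, "S": sum(xs)}

if __name__ == "__main__":
    world = forward_sample()
    print(world)
```

In this representation a kernel is just a function from parent values to a sampler, so the number of continuous variables in a sampled world can itself be random and unbounded, which is exactly where discrete-only or bounded-continuous correctness theorems stop applying.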

The extended version of our paper, with formal definitions and theorems, is available here: Extended-BLOG-Semantics-Paper

Authors: Nicholas Hay*, Siddharth Srivastava*, Yi Wu and Stuart Russell (*equal contribution)