Discrete-continuous mixtures in probabilistic programming: Generalized semantics and inference algorithms

Yi Wu, Siddharth Srivastava, Nicholas Hay, Simon S. Du, Stuart Russell

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Despite the recent successes of probabilistic programming languages (PPLs) in AI applications, PPLs offer only limited support for random variables whose distributions combine discrete and continuous elements. We develop the notion of measure-theoretic Bayesian networks (MTBNs) and use it to provide more general semantics for PPLs with arbitrarily many random variables defined over arbitrary measure spaces. We develop two new general sampling algorithms that are provably correct under the MTBN framework: the lexicographic likelihood weighting (LLW) for general MTBNs and the lexicographic particle filter (LPF), a specialized algorithm for state-space models. We further integrate MTBNs into a widely used PPL system, BLOG, and verify the effectiveness of the new inference algorithms through representative examples.
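
The abstract turns on the distinction between point-mass (discrete) and density (continuous) components of a distribution. As background only, here is a minimal Python sketch, not the paper's LLW algorithm; every name and parameter below is invented for illustration. It samples a variable whose prior mixes an atom at 0 with a Gaussian density, then runs ordinary likelihood weighting on a toy two-node model, which is sound here only because the evidence variable is purely continuous.

import math
import random

def sample_mixed():
    # Prior over X: with probability 0.3 an atom at exactly 0 (discrete part),
    # otherwise a draw from Gaussian(0, 1) (continuous part).
    if random.random() < 0.3:
        return 0.0
    return random.gauss(0.0, 1.0)

def gauss_pdf(y, mu, sigma):
    # Density of Gaussian(mu, sigma) evaluated at y.
    return math.exp(-0.5 * ((y - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def likelihood_weighting(y_obs, n=100000):
    # Estimate P(X = 0 | Y = y_obs) where Y ~ Gaussian(X, 0.5):
    # sample X from its prior, weight each sample by the likelihood of the
    # observed Y, and self-normalize. This is plain likelihood weighting;
    # it is valid here only because the evidence Y is purely continuous.
    num = 0.0
    den = 0.0
    for _ in range(n):
        x = sample_mixed()
        w = gauss_pdf(y_obs, x, 0.5)
        if x == 0.0:
            num += w
        den += w
    return num / den

print(likelihood_weighting(0.1))  # posterior mass on the atom at 0

When the evidence itself has a mixed discrete-continuous likelihood, the raw weights above are no longer mutually comparable; that is the failure mode the abstract's lexicographic algorithms are designed to handle correctly, and this sketch makes no such claim.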

Original language: English (US)
Title of host publication: 35th International Conference on Machine Learning, ICML 2018
Editors: Jennifer Dy, Andreas Krause
Publisher: International Machine Learning Society (IMLS)
Pages: 8494-8503
Number of pages: 10
ISBN (Electronic): 9781510867963
State: Published - Jan 1 2018
Event: 35th International Conference on Machine Learning, ICML 2018 - Stockholm, Sweden
Duration: Jul 10 2018 – Jul 15 2018

Publication series

Name: 35th International Conference on Machine Learning, ICML 2018
Volume: 12

Conference

Conference: 35th International Conference on Machine Learning, ICML 2018
Country: Sweden
City: Stockholm
Period: 7/10/18 – 7/15/18

Fingerprint

  • Bayesian networks
  • Computer programming
  • Computer programming languages
  • Semantics
  • Random variables
  • Sampling

ASJC Scopus subject areas

  • Computational Theory and Mathematics
  • Human-Computer Interaction
  • Software

Cite this

APA: Wu, Y., Srivastava, S., Hay, N., Du, S. S., & Russell, S. (2018). Discrete-continuous mixtures in probabilistic programming: Generalized semantics and inference algorithms. In J. Dy, & A. Krause (Eds.), 35th International Conference on Machine Learning, ICML 2018 (pp. 8494-8503). (35th International Conference on Machine Learning, ICML 2018; Vol. 12). International Machine Learning Society (IMLS).

Standard: Wu, Yi; Srivastava, Siddharth; Hay, Nicholas; Du, Simon S.; Russell, Stuart. Discrete-continuous mixtures in probabilistic programming: Generalized semantics and inference algorithms. 35th International Conference on Machine Learning, ICML 2018. Ed. Jennifer Dy; Andreas Krause. International Machine Learning Society (IMLS), 2018. p. 8494-8503 (35th International Conference on Machine Learning, ICML 2018; Vol. 12).


Harvard: Wu, Y, Srivastava, S, Hay, N, Du, SS & Russell, S 2018, Discrete-continuous mixtures in probabilistic programming: Generalized semantics and inference algorithms. in J Dy & A Krause (eds), 35th International Conference on Machine Learning, ICML 2018. 35th International Conference on Machine Learning, ICML 2018, vol. 12, International Machine Learning Society (IMLS), pp. 8494-8503, 35th International Conference on Machine Learning, ICML 2018, Stockholm, Sweden, 7/10/18.

Vancouver: Wu Y, Srivastava S, Hay N, Du SS, Russell S. Discrete-continuous mixtures in probabilistic programming: Generalized semantics and inference algorithms. In Dy J, Krause A, editors, 35th International Conference on Machine Learning, ICML 2018. International Machine Learning Society (IMLS). 2018. p. 8494-8503. (35th International Conference on Machine Learning, ICML 2018).

Author: Wu, Yi; Srivastava, Siddharth; Hay, Nicholas; Du, Simon S.; Russell, Stuart. / Discrete-continuous mixtures in probabilistic programming: Generalized semantics and inference algorithms. 35th International Conference on Machine Learning, ICML 2018. Editor: Jennifer Dy; Andreas Krause. International Machine Learning Society (IMLS), 2018. pp. 8494-8503 (35th International Conference on Machine Learning, ICML 2018).
BibTeX:

@inproceedings{9fdde234584f4e70bcac0735339d7249,
title = "Discrete-continuous mixtures in probabilistic programming: Generalized semantics and inference algorithms",
abstract = "Despite the recent successes of probabilistic programming languages (PPLs) in AI applications, PPLs offer only limited support for random variables whose distributions combine discrete and continuous elements. We develop the notion of measure-theoretic Bayesian networks (MTBNs) and use it to provide more general semantics for PPLs with arbitrarily many random variables defined over arbitrary measure spaces. We develop two new general sampling algorithms that are provably correct under the MTBN framework: the lexicographic likelihood weighting (LLW) for general MTBNs and the lexicographic particle filter (LPF), a specialized algorithm for state-space models. We further integrate MTBNs into a widely used PPL system, BLOG, and verify the effectiveness of the new inference algorithms through representative examples.",
author = "Yi Wu and Siddharth Srivastava and Nicholas Hay and Du, {Simon S.} and Stuart Russell",
year = "2018",
month = "1",
day = "1",
language = "English (US)",
series = "35th International Conference on Machine Learning, ICML 2018",
volume = "12",
publisher = "International Machine Learning Society (IMLS)",
pages = "8494--8503",
editor = "Jennifer Dy and Andreas Krause",
booktitle = "35th International Conference on Machine Learning, ICML 2018",
}

RIS:

TY - GEN
T1 - Discrete-continuous mixtures in probabilistic programming
T2 - Generalized semantics and inference algorithms
AU - Wu, Yi
AU - Srivastava, Siddharth
AU - Hay, Nicholas
AU - Du, Simon S.
AU - Russell, Stuart
PY - 2018/1/1
Y1 - 2018/1/1
N2 - Despite the recent successes of probabilistic programming languages (PPLs) in AI applications, PPLs offer only limited support for random variables whose distributions combine discrete and continuous elements. We develop the notion of measure-theoretic Bayesian networks (MTBNs) and use it to provide more general semantics for PPLs with arbitrarily many random variables defined over arbitrary measure spaces. We develop two new general sampling algorithms that are provably correct under the MTBN framework: the lexicographic likelihood weighting (LLW) for general MTBNs and the lexicographic particle filter (LPF), a specialized algorithm for state-space models. We further integrate MTBNs into a widely used PPL system, BLOG, and verify the effectiveness of the new inference algorithms through representative examples.
AB - Despite the recent successes of probabilistic programming languages (PPLs) in AI applications, PPLs offer only limited support for random variables whose distributions combine discrete and continuous elements. We develop the notion of measure-theoretic Bayesian networks (MTBNs) and use it to provide more general semantics for PPLs with arbitrarily many random variables defined over arbitrary measure spaces. We develop two new general sampling algorithms that are provably correct under the MTBN framework: the lexicographic likelihood weighting (LLW) for general MTBNs and the lexicographic particle filter (LPF), a specialized algorithm for state-space models. We further integrate MTBNs into a widely used PPL system, BLOG, and verify the effectiveness of the new inference algorithms through representative examples.
UR - http://www.scopus.com/inward/record.url?scp=85057297082&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85057297082&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:85057297082
T3 - 35th International Conference on Machine Learning, ICML 2018
SP - 8494
EP - 8503
BT - 35th International Conference on Machine Learning, ICML 2018
A2 - Dy, Jennifer
A2 - Krause, Andreas
PB - International Machine Learning Society (IMLS)
ER -