Approximations in Probabilistic Programs

Published: 07 Oct 2019 (last modified: 05 May 2023) · Program Transformations @ NeurIPS 2019 Poster
Abstract: We introduce a new language construct, "stat", which converts the description of the Markov kernel of an ergodic Markov chain into a sample from its unique stationary distribution. Up to minor changes in how certain error conditions are handled, we show that language constructs for soft-conditioning and normalization can be compiled away from the extended language. We then explore the problem of approximately implementing the semantics of the language with potentially nested "stat" expressions, in a language without "stat". For a single "stat" term, the natural unrolling yields provable asymptotic guarantees. In the general case, under uniform ergodicity assumptions, we give quantitative error bounds and convergence results for the approximate implementation of the extended first-order language. We leave open the question of whether the same guarantees hold under mere geometric ergodicity.
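As a rough illustration of the "natural unrolling" mentioned in the abstract, the sketch below approximates a single "stat" term by running a user-supplied Markov kernel for a fixed number of steps and returning the final state. This is a minimal sketch, not the paper's implementation or semantics; the function names and the random-walk Metropolis kernel targeting a standard normal are hypothetical examples chosen only to make the snippet runnable.

```python
# Minimal sketch (assumed, not from the paper): approximate stat(kernel, init)
# by its natural unrolling, i.e. iterating the kernel n_steps times.
import math
import random


def stat_unrolled(kernel, init_state, n_steps):
    """Return the chain's state after n_steps applications of kernel.

    Under ergodicity assumptions such as those discussed in the abstract,
    the distribution of the result approaches the kernel's unique
    stationary distribution as n_steps grows.
    """
    state = init_state
    for _ in range(n_steps):
        state = kernel(state)
    return state


def rw_metropolis_kernel(x, step=0.5):
    """One random-walk Metropolis step targeting the standard normal
    (a hypothetical example kernel)."""
    proposal = x + random.gauss(0.0, step)
    log_accept = -0.5 * (proposal ** 2 - x ** 2)
    if random.random() < math.exp(min(0.0, log_accept)):
        return proposal
    return x


if __name__ == "__main__":
    # Each call yields an approximate draw from the stationary distribution;
    # increasing n_steps tightens the approximation.
    samples = [stat_unrolled(rw_metropolis_kernel, 0.0, 1000) for _ in range(5)]
    print(samples)
```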