Time-efficient Bayesian Inference for a (Skewed) Von Mises Distribution on the Torus in a Deep Probabilistic Programming Language

Published: 01 Jan 2021, Last Modified: 12 May 2023, MFI 2021
Abstract: Probabilistic programming languages (PPLs) sit at the interface between statistics and the theory of programming languages. PPLs formulate statistical models as stochastic programs that enable automatic inference and optimization. Pyro [1] and its sibling NumPyro [2] are PPLs built on top of the deep learning frameworks PyTorch [3] and Jax [4], respectively. Both PPLs provide simple, highly similar interfaces for inference using efficient implementations of Hamiltonian Monte Carlo (HMC), the No-U-Turn Sampler (NUTS), and Stochastic Variational Inference (SVI). They automatically generate variational distributions from a model, automatically enumerate discrete variables, and support formulating deep probabilistic models such as variational autoencoders and deep Markov models. The Sine von Mises distribution and its skewed variant are toroidal distributions relevant to protein bioinformatics. They provide a natural way to model the dihedral angles of protein structures, which is important in protein structure prediction, simulation, and analysis. We present efficient implementations of the Sine von Mises distribution and its skewing in Pyro and NumPyro, and devise a simulation method that increases efficiency by several orders of magnitude when using parallel hardware (i.e., modern CPUs, GPUs, and TPUs). We demonstrate the use of the skewed Sine von Mises distribution by modeling dihedral angles of proteins with a Bayesian mixture model inferred using NUTS, exploiting NumPyro's facilities for automatic enumeration [5].
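The abstract does not reproduce the densities involved, but both are standard in the directional-statistics literature. As a minimal sketch (not the paper's implementation), the Sine von Mises density on the torus for angles (φ, ψ) is proportional to exp(κ₁cos(φ−μ) + κ₂cos(ψ−ν) + λ sin(φ−μ)sin(ψ−ν)), and sine-skewing multiplies the base density by (1 + λ₁sin(φ−μ) + λ₂sin(ψ−ν)) with |λ₁| + |λ₂| ≤ 1. The function names and parameterization below are illustrative, and only unnormalized log-densities are shown:

```python
import math

def sine_von_mises_logpdf(phi, psi, mu, nu, kappa1, kappa2, lam):
    """Unnormalized log-density of the Sine (bivariate) von Mises
    distribution on the torus, with locations (mu, nu), concentrations
    (kappa1, kappa2), and correlation parameter lam."""
    return (kappa1 * math.cos(phi - mu)
            + kappa2 * math.cos(psi - nu)
            + lam * math.sin(phi - mu) * math.sin(psi - nu))

def sine_skewed_logpdf(phi, psi, mu, nu, kappa1, kappa2, lam, skew1, skew2):
    """Unnormalized log-density of the sine-skewed variant: the base
    density times (1 + skew1*sin(phi - mu) + skew2*sin(psi - nu)),
    requiring |skew1| + |skew2| <= 1 so the factor stays non-negative."""
    assert abs(skew1) + abs(skew2) <= 1.0
    base = sine_von_mises_logpdf(phi, psi, mu, nu, kappa1, kappa2, lam)
    return base + math.log1p(skew1 * math.sin(phi - mu)
                             + skew2 * math.sin(psi - nu))
```

Setting both skew parameters to zero recovers the symmetric base density, which is the property the skewed variant relaxes in order to fit asymmetric dihedral-angle data.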