Forward-Backward Latent State Inference for Hidden Continuous-Time semi-Markov Chains

Published: 31 Oct 2022, Last Modified: 14 Oct 2022
NeurIPS 2022 Accept
Keywords: forward-backward, continuous time, hsmm, ctsmc, semi-Markov, latent state inference
TL;DR: sample-free posterior latent state inference in hidden continuous-time semi-Markov chains
Abstract: Hidden semi-Markov Models (HSMMs), while broadly in use, are restricted to a discrete and uniform time grid. They are thus not well suited to explain the often irregularly spaced discrete event data arising from continuous-time phenomena. We show that the non-sampling-based latent state inference used in HSMMs can be generalized to latent Continuous-Time semi-Markov Chains (CTSMCs). We formulate integro-differential forward and backward equations adjusted to the observation likelihood and introduce an exact integral equation for the Bayesian posterior marginals as well as a scalable Viterbi-type algorithm for posterior path estimates. The presented equations can be solved efficiently using well-known numerical methods. As a practical tool, variable-step HSMMs are introduced. We evaluate our approaches in latent state inference scenarios in comparison to classical HSMMs.
Supplementary Material: pdf
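For context, the abstract refers to the forward-backward latent state inference used in classical HSMMs, which the paper generalizes to continuous time. The sketch below shows a minimal explicit-duration, discrete-time HSMM forward pass, the kind of recursion the proposed integro-differential equations generalize; it is an illustrative baseline under standard assumptions, not the paper's CTSMC method, and all names are chosen here for exposition.

```python
import numpy as np

def hsmm_forward(A, dur_pmf, obs_lik, pi):
    """Explicit-duration HSMM forward pass (classical discrete-time baseline).

    A       : (K, K) state transition matrix between segments (no self-transitions)
    dur_pmf : (K, Dmax) duration probabilities p_j(d) for d = 1..Dmax
    obs_lik : (T, K) observation likelihoods b_j(o_t)
    pi      : (K,) initial state distribution
    Returns alpha, where alpha[t, j] = P(o_{1:t+1}, a segment in state j ends at t).
    """
    T, K = obs_lik.shape
    Dmax = dur_pmf.shape[1]
    alpha = np.zeros((T, K))
    for t in range(T):
        for j in range(K):
            total = 0.0
            for d in range(1, min(Dmax, t + 1) + 1):
                # Emission probability of the length-d segment ending at time t.
                seg = obs_lik[t - d + 1 : t + 1, j].prod()
                if d == t + 1:
                    prev = pi[j]                      # segment starts at time 0
                else:
                    prev = alpha[t - d, :] @ A[:, j]  # enter j from some state i
                total += prev * dur_pmf[j, d - 1] * seg
            alpha[t, j] = total
    return alpha

# Usage sketch with random inputs (K states, T observations):
# alpha = hsmm_forward(A, dur_pmf, obs_lik, pi)
# With the usual right-censoring simplification, the sequence likelihood is alpha[-1].sum().
```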